mlflow.onnx
The mlflow.onnx module provides APIs for logging and loading ONNX models in the MLflow Model
format. This module exports MLflow Models with the following flavors:
- ONNX (native) format
  This is the main flavor that can be loaded back as an ONNX model object.
- mlflow.pyfunc
  Produced for use by generic pyfunc-based deployment tools and batch inference.
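For illustration, a minimal sketch of logging an ONNX model and loading it back through both flavors; the model file name, artifact path, and input-column assumptions are hypothetical:

    import onnx
    import mlflow
    import mlflow.onnx
    import mlflow.pyfunc

    # Load an existing ONNX model from disk (file name is hypothetical).
    model = onnx.load("model.onnx")

    # Logging writes both the native ONNX flavor and the generic pyfunc flavor.
    with mlflow.start_run() as run:
        mlflow.onnx.log_model(model, artifact_path="onnx_model")

    model_uri = "runs:/{}/onnx_model".format(run.info.run_id)

    # Native flavor: returns an ONNX model object.
    native_model = mlflow.onnx.load_model(model_uri)

    # Pyfunc flavor: generic predict() interface for batch inference on a
    # pandas DataFrame whose columns match the model's input names.
    pyfunc_model = mlflow.pyfunc.load_model(model_uri)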
mlflow.onnx.get_default_conda_env()

Note
Experimental: This method may change or be removed in a future release without warning.
Returns
  The default Conda environment for MLflow Models produced by calls to save_model() and log_model().
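For illustration, a sketch of retrieving the default environment and extending it before logging; the extra pinned dependency is hypothetical:

    import mlflow.onnx

    # Start from the flavor's default Conda environment...
    conda_env = mlflow.onnx.get_default_conda_env()

    # ...and, for example, pin an extra (hypothetical) dependency.
    conda_env["dependencies"].append("numpy=1.16.4")

    # The customized environment can then be passed to save_model() or log_model():
    # mlflow.onnx.log_model(onnx_model, "onnx_model", conda_env=conda_env)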
mlflow.onnx.load_model(model_uri)

Note
Experimental: This method may change or be removed in a future release without warning.
Load an ONNX model from a local file or a run.
Parameters
  - model_uri – The location, in URI format, of the MLflow model, for example:
    - /Users/me/path/to/local/model
    - relative/path/to/local/model
    - s3://my_bucket/path/to/model
    - runs:/<mlflow_run_id>/run-relative/path/to/model
    - models:/<model_name>/<model_version>
    - models:/<model_name>/<stage>
    For more information about supported URI schemes, see the Artifacts Documentation.
Returns
  An ONNX model instance.
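For illustration, a sketch of loading the native flavor and evaluating it with onnxruntime; the run ID placeholder, artifact path, and input shape are hypothetical:

    import numpy as np
    import onnxruntime
    import mlflow.onnx

    # Placeholder URI; substitute a real run ID and run-relative artifact path.
    model_uri = "runs:/<mlflow_run_id>/onnx_model"

    # Returns the ONNX model object (an onnx.ModelProto).
    onnx_model = mlflow.onnx.load_model(model_uri)

    # Run it directly with onnxruntime; the input shape here is hypothetical.
    session = onnxruntime.InferenceSession(onnx_model.SerializeToString())
    input_name = session.get_inputs()[0].name
    outputs = session.run(None, {input_name: np.random.rand(1, 4).astype(np.float32)})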
mlflow.onnx.log_model(onnx_model, artifact_path, conda_env=None, registered_model_name=None)

Note
Experimental: This method may change or be removed in a future release without warning.
Log an ONNX model as an MLflow artifact for the current run.
Parameters
  - onnx_model – ONNX model to be saved.
  - artifact_path – Run-relative artifact path.
  - conda_env – Either a dictionary representation of a Conda environment or the path to a Conda environment yaml file. If provided, this describes the environment this model should be run in. At minimum, it should specify the dependencies contained in get_default_conda_env(). If None, the default get_default_conda_env() environment is added to the model. The following is an example dictionary representation of a Conda environment:

        {
            'name': 'mlflow-env',
            'channels': ['defaults'],
            'dependencies': [
                'python=3.6.0',
                'onnx=1.4.1',
                'onnxruntime=0.3.0'
            ]
        }

  - registered_model_name – Note: Experimental: This argument may change or be removed in a future release without warning. If given, create a model version under registered_model_name, also creating a registered model if one with the given name does not exist.
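For illustration, a sketch of logging a model with a custom environment and registering it; the model file path and registered model name are hypothetical:

    import onnx
    import mlflow
    import mlflow.onnx

    onnx_model = onnx.load("model.onnx")  # hypothetical local model file

    conda_env = {
        'name': 'mlflow-env',
        'channels': ['defaults'],
        'dependencies': [
            'python=3.6.0',
            'onnx=1.4.1',
            'onnxruntime=0.3.0'
        ]
    }

    with mlflow.start_run():
        mlflow.onnx.log_model(
            onnx_model,
            artifact_path="onnx_model",
            conda_env=conda_env,
            registered_model_name="MyOnnxModel",  # hypothetical; requires a tracking server with a model registry
        )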
mlflow.onnx.save_model(onnx_model, path, conda_env=None, mlflow_model=<mlflow.models.Model object>)

Note
Experimental: This method may change or be removed in a future release without warning.
Save an ONNX model to a path on the local file system.
Parameters
  - onnx_model – ONNX model to be saved.
  - path – Local path where the model is to be saved.
  - conda_env – Either a dictionary representation of a Conda environment or the path to a Conda environment yaml file. If provided, this describes the environment this model should be run in. At minimum, it should specify the dependencies contained in get_default_conda_env(). If None, the default get_default_conda_env() environment is added to the model. The following is an example dictionary representation of a Conda environment:

        {
            'name': 'mlflow-env',
            'channels': ['defaults'],
            'dependencies': [
                'python=3.6.0',
                'onnx=1.4.1',
                'onnxruntime=0.3.0'
            ]
        }

  - mlflow_model – mlflow.models.Model this flavor is being added to.
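For illustration, a sketch of saving a model to a local directory and loading it back; the source and target paths are hypothetical:

    import onnx
    import mlflow.onnx

    onnx_model = onnx.load("model.onnx")  # hypothetical source model

    # Save to a local directory; the target path must not already exist.
    mlflow.onnx.save_model(onnx_model, "exported_onnx_model")

    # A locally saved model can be loaded back with the same path.
    reloaded = mlflow.onnx.load_model("exported_onnx_model")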