mlflow.azureml
The mlflow.azureml module provides an API for deploying MLflow models to Azure Machine Learning.
- mlflow.azureml.build_image(model_uri, workspace, image_name=None, model_name=None, mlflow_home=None, description=None, tags=None, synchronous=True)

Note: Experimental: This method may change or be removed in a future release without warning.
Register an MLflow model with Azure ML and build an Azure ML ContainerImage for deployment. The resulting image can be deployed as a web service to Azure Container Instances (ACI) or Azure Kubernetes Service (AKS).
The resulting Azure ML ContainerImage will contain a webserver that processes model queries. For information about the input data formats accepted by this webserver, see the MLflow deployment tools documentation.
- Parameters
model_uri –
The location, in URI format, of the MLflow model used to build the Azure ML deployment image. For example:
/Users/me/path/to/local/model
relative/path/to/local/model
s3://my_bucket/path/to/model
runs:/<mlflow_run_id>/run-relative/path/to/model
models:/<model_name>/<model_version>
models:/<model_name>/<stage>
For more information about supported URI schemes, see Referencing Artifacts.
image_name – The name to assign the Azure Container Image that will be created. If unspecified, a unique image name will be generated.
model_name – The name to assign the Azure Model that will be created. If unspecified, a unique model name will be generated.
workspace – The AzureML workspace in which to build the image. This is an azureml.core.Workspace object.
mlflow_home – Path to a local copy of the MLflow GitHub repository. If specified, the image will install MLflow from this directory. Otherwise, it will install MLflow from pip.
description – A string description to associate with the Azure Container Image and the Azure Model that will be created. For more information, see https://docs.microsoft.com/en-us/python/api/azureml-core/azureml.core.image.container.containerimageconfig?view=azure-ml-py and https://docs.microsoft.com/en-us/python/api/azureml-core/azureml.core.model.model?view=azure-ml-py#register.
tags – A collection of tags, represented as a dictionary of string key-value pairs, to associate with the Azure Container Image and the Azure Model that will be created. These tags are added to a set of default tags that include the model URI, and more. For more information, see https://docs.microsoft.com/en-us/python/api/azureml-core/azureml.core.image.container.containerimageconfig?view=azure-ml-py and https://docs.microsoft.com/en-us/python/api/azureml-core/azureml.core.model.model?view=azure-ml-py#register.
synchronous – If True, this method blocks until the image creation procedure terminates before returning. If False, the method returns immediately, but the returned image will not be available until the asynchronous creation process completes. Use the azureml.core.Image.wait_for_creation() function to wait for the creation process to complete.
- Returns
A tuple containing the following elements, in order:
- An azureml.core.image.ContainerImage object containing metadata for the new image.
- An azureml.core.model.Model object containing metadata for the new model.
import mlflow.azureml

from azureml.core import Workspace
from azureml.core.webservice import AciWebservice, Webservice

# Load or create an Azure ML Workspace
workspace_name = "<Name of your Azure ML workspace>"
subscription_id = "<Your Azure subscription ID>"
resource_group = "<Name of the Azure resource group in which to create Azure ML resources>"
location = "<Name of the Azure location (region) in which to create Azure ML resources>"
azure_workspace = Workspace.create(name=workspace_name,
                                   subscription_id=subscription_id,
                                   resource_group=resource_group,
                                   location=location,
                                   create_resource_group=True,
                                   exist_ok=True)

# Build an Azure ML Container Image for an MLflow model
azure_image, azure_model = mlflow.azureml.build_image(model_uri="<model_uri>",
                                                      workspace=azure_workspace,
                                                      synchronous=True)
# If your image build failed, you can access build logs at the following URI:
print("Access the following URI for build logs: {}".format(azure_image.image_build_log_uri))

# Deploy the image to Azure Container Instances (ACI) for real-time serving
webservice_deployment_config = AciWebservice.deploy_configuration()
webservice = Webservice.deploy_from_image(image=azure_image,
                                          workspace=azure_workspace,
                                          name="<deployment-name>")
webservice.wait_for_deployment()
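Once the webservice is running, it can be queried over HTTP. The sketch below shows one way to build a request payload in the pandas-split JSON orientation accepted by MLflow's pyfunc scoring server; the column names ("x1", "x2") and the data values are hypothetical placeholders for your model's actual input schema.

```python
import json

# Build a pandas-split-oriented JSON payload. The column names and values
# here are placeholders; substitute your model's real feature names.
payload = json.dumps({"columns": ["x1", "x2"],
                      "data": [[1.0, 2.0], [3.0, 4.0]]})
print(payload)

# To query a live deployment, POST the payload to the service's scoring URI
# (requires the `requests` package and the `webservice` object from the
# example above):
#
# import requests
# response = requests.post(
#     url=webservice.scoring_uri,
#     data=payload,
#     headers={"Content-Type": "application/json; format=pandas-split"},
# )
# print(response.json())
```

The request itself is left commented out because it requires a live Azure deployment; only the payload construction runs locally.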
- mlflow.azureml.deploy(model_uri, workspace, deployment_config=None, service_name=None, model_name=None, tags=None, mlflow_home=None, synchronous=True)

Note: Experimental: This method may change or be removed in a future release without warning.
Register an MLflow model with Azure ML and deploy a webservice to Azure Container Instances (ACI) or Azure Kubernetes Service (AKS).
The deployed service will contain a webserver that processes model queries. For information about the input data formats accepted by this webserver, see the MLflow deployment tools documentation.
- Parameters
model_uri –
The location, in URI format, of the MLflow model used to build the Azure ML deployment image. For example:
/Users/me/path/to/local/model
relative/path/to/local/model
s3://my_bucket/path/to/model
runs:/<mlflow_run_id>/run-relative/path/to/model
models:/<model_name>/<model_version>
models:/<model_name>/<stage>
For more information about supported URI schemes, see Referencing Artifacts.
workspace – The AzureML workspace in which to deploy the service. This is an azureml.core.Workspace object.
deployment_config – The configuration for the Azure web service. This configuration allows you to specify the resources the webservice will use and the compute cluster it will be deployed in. If unspecified, the web service will be deployed into an Azure Container Instance. This is an azureml.core.DeploymentConfig object. For more information, see https://docs.microsoft.com/python/api/azureml-core/azureml.core.webservice.aks.aksservicedeploymentconfiguration and https://docs.microsoft.com/en-us/python/api/azureml-core/azureml.core.webservice.aci.aciservicedeploymentconfiguration
service_name – The name to assign the Azure Machine Learning webservice that will be created. If unspecified, a unique name will be generated.
model_name – The name to assign the Azure Model that will be created. If unspecified, a unique model name will be generated.
tags – A collection of tags, represented as a dictionary of string key-value pairs, to associate with the Azure Model and Deployment that will be created. These tags are added to a set of default tags that include the model URI, and more. For more information, see https://docs.microsoft.com/en-us/python/api/azureml-core/azureml.core.model(class)?view=azure-ml-py.
mlflow_home – Path to a local copy of the MLflow GitHub repository. If specified, the image will install MLflow from this directory. Otherwise, it will install MLflow from pip.
synchronous – If True, this method blocks until the deployment process terminates before returning. If False, the method returns immediately, but the returned service will not be available until the asynchronous deployment process completes. Use the azureml.core.Webservice.wait_for_deployment() function to wait for the deployment process to complete.
- Returns
A tuple containing the following elements, in order:
- An azureml.core.webservice.Webservice object containing metadata for the new service.
- An azureml.core.model.Model object containing metadata for the new model.
import mlflow.azureml

from azureml.core import Workspace
from azureml.core.webservice import AciWebservice, Webservice

# Load or create an Azure ML Workspace
workspace_name = "<Name of your Azure ML workspace>"
subscription_id = "<Your Azure subscription ID>"
resource_group = "<Name of the Azure resource group in which to create Azure ML resources>"
location = "<Name of the Azure location (region) in which to create Azure ML resources>"
azure_workspace = Workspace.create(name=workspace_name,
                                   subscription_id=subscription_id,
                                   resource_group=resource_group,
                                   location=location,
                                   create_resource_group=True,
                                   exist_ok=True)

# Create an Azure Container Instance webservice for an MLflow model
azure_service, azure_model = mlflow.azureml.deploy(model_uri="<model_uri>",
                                                   service_name="<deployment-name>",
                                                   workspace=azure_workspace,
                                                   synchronous=True)
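When the default ACI resources are not sufficient, the deployment_config parameter can be used to customize them. The following sketch, under the assumption that AciWebservice.deploy_configuration accepts cpu_cores and memory_gb arguments (see the Azure ML SDK documentation linked above), passes an explicit ACI configuration to mlflow.azureml.deploy; all bracketed names are placeholders, and the resource values are illustrative only.

```python
import mlflow.azureml

from azureml.core import Workspace
from azureml.core.webservice import AciWebservice

# Retrieve an existing Azure ML Workspace (placeholders must be filled in)
azure_workspace = Workspace.get(name="<workspace-name>",
                                subscription_id="<subscription-id>",
                                resource_group="<resource-group>")

# Customize the ACI resources for the webservice; values are illustrative
aci_config = AciWebservice.deploy_configuration(cpu_cores=1, memory_gb=2)

# Deploy the MLflow model using the custom deployment configuration
azure_service, azure_model = mlflow.azureml.deploy(model_uri="<model_uri>",
                                                   workspace=azure_workspace,
                                                   deployment_config=aci_config,
                                                   service_name="<deployment-name>",
                                                   synchronous=True)
```

An AKS deployment follows the same pattern with a configuration produced by AksWebservice.deploy_configuration instead, as described in the deployment_config parameter documentation above.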