
gvanderberg / tf_azure_ml


Terraform scripts to create Azure Machine Learning services

License: MIT License

HCL 100.00%
azure terraform terraform-azure azure-machine-learning azure-key-vault azure-storage-account azure-virtual-networks

tf_azure_ml's Introduction

Azure Machine Learning

Disable network policies for private endpoints

az network vnet subnet update --name default --resource-group myResourceGroup --vnet-name myVirtualNetwork --disable-private-endpoint-network-policies true

Add a private endpoint to a workspace

az ml workspace private-endpoint add --resource-group myWSResourceGroup --workspace-name myWorkspaceName --pe-name myPrivateEndpoint --pe-vnet-name myVirtualNetwork --pe-subnet-name mySubnet --pe-resource-group myVNResourceGroup

Internal AKS load balancer

az ml computetarget update aks --name myInferenceCluster --load-balancer-subnet mySubnet --load-balancer-type InternalLoadBalancer --workspace myWorkspaceName --resource-group myResourceGroup

Connection information

If you know the name of the deployed service, you can create a new instance of Webservice and provide the workspace and service name as parameters. The new object contains information about the deployed service:

from azureml.core import Workspace, Webservice

ws = Workspace.from_config()  # reads the workspace config.json in the current directory
service = Webservice(workspace=ws, name='myServiceName')
print(service.scoring_uri)
print(service.swagger_uri)

If you know the name of the deployed service, use the az ml service show command:

az ml service show --name myServiceName --resource-group myWSResourceGroup --workspace-name myWorkspaceName
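
If you do not know the service name, the SDK can also list every service deployed in the workspace. A minimal sketch, assuming a workspace config.json is available locally:

from azureml.core import Workspace
from azureml.core.webservice import Webservice

ws = Workspace.from_config()

# Print the name, state and scoring endpoint of every deployed service.
for svc in Webservice.list(ws):
    print(svc.name, svc.state, svc.scoring_uri)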

Authentication with keys

When you enable authentication for a deployment, you automatically create authentication keys.

  • Authentication is enabled by default when you are deploying to Azure Kubernetes Service.
  • Authentication is disabled by default when you are deploying to Azure Container Instances.

If key authentication is enabled, you can use the get_keys method to retrieve the primary and secondary authentication keys:

primary, secondary = service.get_keys()
print(primary)

From the CLI, use the az ml service get-keys command:

az ml service get-keys --name myServiceName --resource-group myWSResourceGroup --workspace-name myWorkspaceName

If you need to regenerate a key, use regen-key.
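
To call the scoring endpoint with key authentication, pass the key as a bearer token in the Authorization header. A minimal sketch using the requests library; the payload shape is an assumption and must match what your entry script's run() function expects:

import json
import requests

primary, secondary = service.get_keys()

headers = {
    'Content-Type': 'application/json',
    'Authorization': f'Bearer {primary}',  # primary key retrieved above
}
payload = {'data': [[1.0, 2.0, 3.0, 4.0]]}  # assumed input format, adjust to your model

response = requests.post(service.scoring_uri, data=json.dumps(payload), headers=headers)
print(response.status_code, response.json())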

Authentication with tokens

When you enable token authentication for a web service, a user must provide an Azure Machine Learning JWT token to the web service to access it.

  • Token authentication is disabled by default when you are deploying to Azure Kubernetes Service.
  • Token authentication is not supported when you are deploying to Azure Container Instances.

If token authentication is enabled, you can use the get_token method to retrieve a bearer token and that token's expiration time:

token, refresh_by = service.get_token()
print(token)

From the CLI, use the az ml service get-access-token command:

az ml service get-access-token --name myServiceName --resource-group myWSResourceGroup --workspace-name myWorkspaceName

Currently the only way to retrieve the token is by using the Azure Machine Learning SDK or the Azure CLI machine learning extension.
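
The token expires after the time indicated by refresh_by, so clients should be prepared to request a new one. A minimal sketch that calls the endpoint with the token and retries once on an authentication failure; the payload shape is again an assumption:

import json
import requests

token, refresh_by = service.get_token()

headers = {
    'Content-Type': 'application/json',
    'Authorization': f'Bearer {token}',
}
payload = {'data': [[1.0, 2.0, 3.0, 4.0]]}

response = requests.post(service.scoring_uri, data=json.dumps(payload), headers=headers)
if response.status_code == 401:
    # Token may have expired; fetch a fresh one and retry once.
    token, refresh_by = service.get_token()
    headers['Authorization'] = f'Bearer {token}'
    response = requests.post(service.scoring_uri, data=json.dumps(payload), headers=headers)

print(response.status_code, response.text)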

Troubleshooting tips

Most issues with consuming endpoints can be resolved by working through the following steps.

Recommended Steps

  1. See Consume an Azure Machine Learning model deployed as a web service. Deploying an Azure Machine Learning model as a web service creates a REST API endpoint. You can send data to this endpoint and receive the prediction returned by the model. The linked article shows how to create clients for the web service using C#, Go, Java, and Python.
  2. See Consume the service from Power BI. After the web service is deployed, it is consumable from Power BI dataflows. Learn how to consume an Azure Machine Learning web service from Power BI.
  3. How to update a deployed web service
  4. See Advanced Entry Script Authoring. This article shows how to write entry scripts for specialized use cases:
     • The InferenceSchema Python package provides a uniform schema for common machine learning applications, as well as a set of decorators that can be used to aid in web-based ML prediction applications.
  5. Troubleshoot a failed deployment:
     • Debug locally
     • HTTP status code 502
     • HTTP status code 503
     • HTTP status code 504
     • Container cannot be scheduled error. When deploying a service to an Azure Kubernetes Service compute target, Azure Machine Learning attempts to schedule the service with the requested amount of resources. If no nodes with the appropriate amount of resources are available in the cluster after 5 minutes, the deployment fails.
     • Service launch fails. As part of the container start-up process, the system invokes the init() function in your scoring script. If there are uncaught exceptions in the init() function, you might see a CrashLoopBackOff error in the error message.
     • Function fails: get_model_path(). Often, the init() function in the scoring script calls Model.get_model_path() to locate a model file or folder of model files in the container. If the model file or folder cannot be found, the function fails.
     • Function fails: run(input_data). If the service is deployed successfully but crashes when you post data to the scoring endpoint, add an error-catching statement to your run(input_data) function (a minimal entry-script sketch follows this list).
     • Advanced debugging. You may need to interactively debug the Python code contained in your model deployment. By using Visual Studio Code and debugpy, you can attach to the code running inside the Docker container.
     • Web service failures in Azure Kubernetes Service. Many web service failures in Azure Kubernetes Service can be debugged by connecting to the cluster using kubectl.
  6. Collect and evaluate model data
  7. Monitor and collect data from ML web service endpoints
  8. Limitations when deploying a model to Azure Container Instances
  9. Secure an Azure Machine Learning inferencing environment with virtual networks
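
Several of the items above concern the entry (scoring) script, in particular the Model.get_model_path() call in init() and error catching in run(). The following minimal sketch illustrates that structure; the registered model name, joblib format, and input layout are assumptions, not part of this repository:

import json
import joblib
from azureml.core.model import Model

model = None

def init():
    # Called once when the container starts. An uncaught exception here is what
    # produces the CrashLoopBackOff error mentioned above.
    global model
    model_path = Model.get_model_path('my-model')  # hypothetical registered model name
    model = joblib.load(model_path)

def run(input_data):
    # Called on every request. Catching exceptions returns an error message to
    # the caller instead of crashing the service.
    try:
        data = json.loads(input_data)['data']
        predictions = model.predict(data)
        return predictions.tolist()
    except Exception as exc:
        return {'error': str(exc)}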

Recommended Documents

Python SDK

Enterprise Readiness and Security
