
Python Machine Learning In Azure ML Studio

Just a quick post to show you how easy it is to migrate Jupyter notebooks into Azure Machine Learning Studio. For this walkthrough we will use the preview of the latest release of ML Studio (https://ml.azure.com/). Microsoft has been hard at work making Machine Learning Studio easier to use, and the preview release has simplified several processes.
When you bring up Azure ML Studio, importing a Jupyter notebook is a straightforward process. Once you’ve loaded Studio, select Notebooks and then select the upload files link. Once you have uploaded the file you’ll be able to select it and view it, but not run it yet.



We can’t run the uploaded notebook yet because we need to set up a VM that will be used to process our data. Being able to control the size of the VM that your notebook runs on lets you control the costs associated with hosting the notebook in Azure. Additionally, you can shut down the VM when not in use to further reduce costs. If you click the “Setup VM” link the dialog below appears. The drop-down allows you to select the size of the VM. For the purposes of this demo, select the smallest VM.
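If you prefer to script this step instead of using the portal dialog, a compute instance can also be created with the Azure ML Python SDK. The sketch below is a minimal example, assuming the v1 azureml-core package and a config.json downloaded from your workspace; the instance name and VM size are placeholders, not values from the walkthrough.

from azureml.core import Workspace
from azureml.core.compute import ComputeInstance, ComputeTarget

# Load the workspace from a local config.json downloaded from the portal.
ws = Workspace.from_config()

# Placeholder instance name and example VM size; pick a size that fits your budget.
instance_name = "demo-notebook-vm"
config = ComputeInstance.provisioning_configuration(vm_size="STANDARD_DS3_V2")

# Create the compute instance and wait until it is ready to run notebooks.
instance = ComputeTarget.create(ws, instance_name, config)
instance.wait_for_completion(show_output=True)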





Once your VM starts, the imported notebook will be editable and runnable on your new VM.

The editing/debugging experience is very similar to the one that you would find in Anaconda.

Create Dataset
With a Jupyter notebook running in Anaconda you have access to the local file system to load models as needed. In order to get our notebook to run in Azure we need to create a dataset and reference it to load our model. This is a straightforward process. To start, we select Create from the Datasets menu.
In this case we want to create our dataset from a local file, so after selecting that option the dialog below is shown.







If you have already created a datastore you can select it here, or you can create a new one. Next, select the file(s) that you want to import as your dataset. Note that by importing the files you are moving the model into the cloud.
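This step can also be scripted with the Python SDK. The sketch below, again assuming the v1 azureml-core package, uploads a local model file to the workspace’s default datastore and registers it as a file dataset; the file name, target path, and dataset name are placeholders.

from azureml.core import Workspace, Dataset

ws = Workspace.from_config()
datastore = ws.get_default_datastore()

# Upload the local model file to the default datastore (placeholder paths and names).
datastore.upload_files(files=["./model.pkl"],
                       target_path="models/",
                       overwrite=True)

# Register the uploaded file as a dataset so notebooks can reference it by name.
dataset = Dataset.File.from_files(path=(datastore, "models/model.pkl"))
dataset = dataset.register(workspace=ws, name="demo-model-dataset")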

 

Once you have imported your model the view below is displayed. This shows the basic information about your new dataset. The most important part of this dialog is on the left: it shows the changes you need to make to your notebook to access the model.



A blow-up of that section is shown below. This code replaces any place where you load the model from a file, as shown below. The replacement is almost a direct cut and paste; you may just need to rename the variable used in the Azure sample code.
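The original screenshot of the file-based load is not reproduced here, but the kind of code being replaced would look something like this (joblib and the model path are just placeholders for whatever your notebook actually uses):

import joblib

# Local, file-system based load used when running under Anaconda (placeholder path).
model = joblib.load("./model.pkl")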



This is replaced with the snippet below in your notebook.
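The generated snippet in the dataset’s consume section looks roughly like the sketch below; the subscription, resource group, workspace, and dataset names are placeholders that Studio fills in for your workspace, and the final joblib load is the piece carried over from the original notebook.

import joblib
from azureml.core import Workspace, Dataset

# Placeholder values; Studio generates the real ones for your workspace.
subscription_id = "<subscription-id>"
resource_group = "<resource-group>"
workspace_name = "<workspace-name>"

workspace = Workspace(subscription_id, resource_group, workspace_name)

# Download the registered dataset locally, then load the model as before.
dataset = Dataset.get_by_name(workspace, name="demo-model-dataset")
dataset.download(target_path=".", overwrite=True)

model = joblib.load("./model.pkl")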



That’s it, you’ve now successfully moved a notebook onto the Azure platform. Note that to avoid incurring unnecessary charges you should shut down your VM after finishing your experiment.
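If you want to script the shutdown as well, the compute instance can be stopped from the SDK; a minimal sketch, assuming the same placeholder instance name used earlier:

from azureml.core import Workspace
from azureml.core.compute import ComputeInstance

ws = Workspace.from_config()

# Stop the compute instance so it no longer accrues compute charges.
instance = ComputeInstance(workspace=ws, name="demo-notebook-vm")
instance.stop(wait_for_completion=True, show_output=True)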



