
Python Machine Learning In Azure ML Studio

Just a quick post to show you how easy it is to migrate Jupyter notebooks into Azure Machine Learning Studio. For this walkthrough we will use the preview of the latest release of ML Studio (https://ml.azure.com/). Microsoft has been hard at work making their Machine Learning Studio easier to use, and the preview release has simplified several processes.
When you bring up Azure ML Studio, importing a Jupyter notebook is a straightforward process. Once you've loaded Studio, select Notebooks and then select the upload files link. Once you have uploaded the file you'll be able to select it and view it, but not run it yet.



We can't run the uploaded notebook yet because we need to set up a VM that will be used to process our data. Being able to choose the size of the VM that your notebook runs on lets you control the costs associated with hosting the notebook in Azure. Additionally, you can shut down the VM when not in use to further reduce costs. If you click the "Setup VM" link, the dialog below appears. The drop-down allows you to select the size of the VM. For the purposes of this demo, select the smallest VM.
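If you prefer scripting over the portal dialog, compute can also be provisioned with the azureml-core SDK. The sketch below is illustrative, not the exact steps Studio performs: the compute name and VM size are placeholder values, and it assumes you have downloaded a config.json for your workspace from the portal.

```python
# Sketch: provisioning a small compute target with the azureml-core SDK.
# "demo-compute" and the VM size are illustrative placeholders.
def provision_compute(compute_name="demo-compute", vm_size="STANDARD_DS1_V2"):
    from azureml.core import Workspace
    from azureml.core.compute import AmlCompute, ComputeTarget

    ws = Workspace.from_config()  # reads the config.json downloaded from the portal
    config = AmlCompute.provisioning_configuration(
        vm_size=vm_size,  # the smallest size keeps demo costs down
        min_nodes=0,      # scale to zero when idle to avoid charges
        max_nodes=1,
    )
    target = ComputeTarget.create(ws, compute_name, config)
    target.wait_for_completion(show_output=True)
    return target
```

Setting min_nodes to 0 mirrors the cost advice above: the compute scales down when you are not using it.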





Once your VM starts, the imported notebook becomes editable and runnable on your new VM.

The editing/debugging experience is very similar to the one you would find in Anaconda.

Create Dataset
With a Jupyter notebook running in Anaconda, you have access to the file system to load models as needed. In order to get our notebook to run in Azure, we need to create a dataset and reference it to load our model. This is a straightforward process. To start, select Create from the Datasets menu.
In this case we want to create our dataset from a local file, so after selecting that option the dialog below is shown.







If you have already created a datastore you can select it here, or you can create a new one. Next, select the file(s) that you want to import as your dataset. Note that by importing the files you are moving the model into the cloud.

 

Once you have imported your model, the view below is displayed. This shows the basic information about your new dataset. The most important part of this view is on the left: it shows the changes you need to make to your notebook to access the model.



An enlarged view of that section is shown below. This code replaces any place where you load the model from a file, as shown below. The replacement is almost a direct cut and paste, though you may need to rename the variable used in the Azure sample code.



Replace it with the snippet below in your notebook.
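The exact snippet Studio generates isn't reproduced here, but it follows the pattern below using the azureml-core SDK: fetch the registered dataset by name, download its files next to the notebook, and then load the model as before. The dataset name and model file name are illustrative placeholders; use the names from your own Studio-generated code.

```python
# Sketch of the Studio-generated pattern: pull the registered dataset
# down to the compute instance, then load the model file it contains.
# "my-model-dataset" and "model.pkl" are illustrative names.
def load_model_from_dataset(dataset_name="my-model-dataset"):
    import pickle
    from azureml.core import Workspace, Dataset

    ws = Workspace.from_config()
    dataset = Dataset.get_by_name(ws, name=dataset_name)
    dataset.download(target_path=".", overwrite=True)
    with open("model.pkl", "rb") as f:
        return pickle.load(f)
```

The only real change from the local version is where the file comes from; once downloaded, the pickle load itself is unchanged.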



That's it, you've now successfully moved a notebook onto the Azure platform. Note that to avoid incurring charges, you should shut down your VM after finishing your experiment.



