The workspace is the top-level resource for Azure Machine Learning, providing a centralized place to work with all the artifacts you create when you use Azure Machine Learning. The workspace keeps a history of all training runs, including logs, metrics, output, and a snapshot of your scripts. You use this information to determine which training run produces the best model.
Once you have a model you like, you register it with the workspace. You then use the registered model and scoring scripts to deploy to Azure Container Instances, Azure Kubernetes Service, or to a field-programmable gate array (FPGA) as a REST-based HTTP endpoint. You can also deploy the model to an Azure IoT Edge device as a module.
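As a sketch of that register-then-deploy flow with the Python SDK (assuming the `azureml-core` package, an existing workspace `config.json`, and a trained model file; the paths and names here are illustrative placeholders):

```python
from azureml.core import Workspace
from azureml.core.model import Model

# Reads config.json downloaded from the Azure portal
ws = Workspace.from_config()

# Register a trained model file with the workspace
# ("outputs/model.pkl" and "my-model" are placeholder names)
model = Model.register(workspace=ws,
                       model_path="outputs/model.pkl",
                       model_name="my-model")
print(model.name, model.version)
```

The registered model, together with a scoring script and an inference configuration, is then what you hand to the deployment target (ACI, AKS, and so on).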
Taxonomy
A taxonomy of the workspace is illustrated in the following diagram:
The diagram shows the following components of a workspace:
- A workspace can contain Azure Machine Learning compute instances, cloud resources configured with the Python environment necessary to run Azure Machine Learning.
- User roles enable you to share your workspace with other users, teams, or projects.
- Compute targets are used to run your experiments.
- When you create the workspace, associated resources are also created for you.
- Experiments are training runs you use to build your models.
- Pipelines are reusable workflows for training and retraining your model.
- Datasets aid in management of the data you use for model training and pipeline creation.
- Once you have a model you want to deploy, you create a registered model.
- Use the registered model and a scoring script to create a deployment endpoint.
Tools for workspace interaction
You can interact with your workspace in the following ways:
Important
Tools marked (preview) below are currently in public preview. The preview version is provided without a service level agreement, and it's not recommended for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews.
- On the web, through Azure Machine Learning studio.
- In any Python environment with the Azure Machine Learning SDK for Python.
- In any R environment with the Azure Machine Learning SDK for R (preview).
- On the command line using the Azure Machine Learning CLI extension.
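For the Python route, connecting to an existing workspace is a one-liner once you have the workspace's configuration file. A minimal sketch (assuming `azureml-core` is installed and a `config.json` has been downloaded from the portal into the working directory):

```python
from azureml.core import Workspace

# Reads config.json from the current directory (or a parent directory)
ws = Workspace.from_config()
print(ws.name, ws.resource_group, ws.location)
```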
Machine learning with a workspace
Machine learning tasks read and/or write artifacts to your workspace.
- Run an experiment to train a model - writes experiment run results to the workspace.
- Use automated ML to train a model - writes training results to the workspace.
- Register a model in the workspace.
- Deploy a model - uses the registered model to create a deployment.
- Create and run reusable workflows.
- View machine learning artifacts such as experiments, pipelines, models, deployments.
- Track and monitor models.
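The first of these tasks, running an experiment, can be sketched with the SDK's interactive logging API (assuming `azureml-core` and a workspace `config.json`; the experiment name and metric are illustrative):

```python
from azureml.core import Workspace, Experiment

ws = Workspace.from_config()
exp = Experiment(workspace=ws, name="my-experiment")  # name is a placeholder

run = exp.start_logging()    # start an interactive run; results land in the workspace
run.log("accuracy", 0.91)    # logged metrics appear in the experiment's run history
run.complete()
```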
Workspace management
You can also perform the following workspace management tasks:
| Workspace management task | Portal | Studio | Python SDK / R SDK | CLI | VS Code |
|---|---|---|---|---|---|
| Create a workspace | ✓ | ✓ | ✓ | ✓ | |
| Manage workspace access | ✓ | ✓ | | | |
| Create and manage compute resources | ✓ | ✓ | ✓ | ✓ | |
| Create a Notebook VM | ✓ | | | | |
Warning
Moving your Azure Machine Learning workspace to a different subscription, or moving the owning subscription to a new tenant, is not supported. Doing so may cause errors.
Create a workspace
There are multiple ways to create a workspace:
- Use the Azure portal for a point-and-click interface to walk you through each step.
- Use the Azure Machine Learning SDK for Python to create a workspace on the fly from Python scripts or Jupyter notebooks.
- Use an Azure Resource Manager template or the Azure Machine Learning CLI when you need to automate or customize the creation with corporate security standards.
- If you work in Visual Studio Code, use the VS Code extension.
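As a sketch of the SDK route (assuming `azureml-core`; every name and ID below is a placeholder for your own values):

```python
from azureml.core import Workspace

# All names/IDs below are placeholders
ws = Workspace.create(name="my-workspace",
                      subscription_id="<subscription-id>",
                      resource_group="my-rg",
                      create_resource_group=True,
                      location="eastus2")
ws.write_config()  # saves config.json for later Workspace.from_config() calls
```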
Associated resources
When you create a new workspace, it automatically creates several Azure resources that are used by the workspace:
- Azure Storage account: Used as the default datastore for the workspace. Jupyter notebooks used with your Azure Machine Learning compute instances are stored here as well.
  Important: By default, the storage account is a general-purpose v1 account. You can upgrade it to general-purpose v2 after the workspace has been created. Do not enable hierarchical namespace on the storage account after upgrading to general-purpose v2.
  To use an existing Azure Storage account, it cannot be a premium account (Premium_LRS or Premium_GRS), and it cannot have a hierarchical namespace (used with Azure Data Lake Storage Gen2). Neither premium storage nor hierarchical namespaces are supported with the workspace's default storage account. You can use premium storage or a hierarchical namespace with non-default storage accounts.
- Azure Container Registry: Registers Docker containers that you use during training and when you deploy a model. To minimize costs, ACR is lazy-loaded until deployment images are created.
- Azure Application Insights: Stores monitoring information about your models.
- Azure Key Vault: Stores secrets that are used by compute targets and other sensitive information that's needed by the workspace.
Note
Instead of creating new resources, you can also use existing Azure services.
What happened to Enterprise edition
As of September 2020, all capabilities that were available in Enterprise edition workspaces are now also available in Basic edition workspaces. New Enterprise workspaces can no longer be created. Any SDK, CLI, or Azure Resource Manager calls that use the `sku` parameter will continue to work, but a Basic workspace will be provisioned. Beginning December 21st, all Enterprise Edition workspaces will be automatically set to Basic Edition, which has the same capabilities. No downtime will occur during this process. On January 1, 2021, Enterprise Edition will be formally retired.
In either edition, customers are responsible for the costs of Azure resources consumed and will not need to pay any additional charges for Azure Machine Learning. Refer to the Azure Machine Learning pricing page for more details.
Next steps
To get started with Azure Machine Learning, see:
Overview
For security reasons, Domino Workspace sessions are only accessible on one port. For example, Jupyter typically uses port 8888. When you launch a Jupyter Workspace session, a Domino executor starts the Jupyter server in a Run, and opens port 8888 to serve the Jupyter application to your browser. If you were to attempt to use the Jupyter terminal to start another application on a different port, it would not be accessible.
However, in some cases you may want to run multiple interactive applications in the same Workspace session. These cases include:
- Editing and debugging Dash or Flask apps live
- Using Tensorboard to view progress of a live training job
Domino 3.5+ supports this with Jupyter Server Proxy and JupyterLab.
Prerequisites
- Python 3+
- Jupyter Server Proxy
Jupyter Server Proxy is installed by default in the latest Domino Standard Environments. To install it in one of your existing environments, see the instructions below.
Installing Jupyter Server Proxy in your environment
If you are not on a recent version of the Domino Standard Environments, you can install Jupyter Server Proxy in your Domino environment by following these steps.
- Add the following lines to your environment’s Dockerfile Instructions.
- Update the JupyterLab definition in the Pluggable Workspace Tools section of your environment.
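The exact Dockerfile lines are not reproduced here, but a minimal sketch for step 1 (assuming a pip-based environment image) would install the extension:

```dockerfile
# Install Jupyter Server Proxy (assumes a pip-based environment image)
RUN pip install jupyter-server-proxy

# On older classic Notebook setups, the server extension may also need enabling:
# RUN jupyter serverextension enable --sys-prefix jupyter_server_proxy
```

Recent versions of jupyter-server-proxy register themselves automatically when installed alongside JupyterLab.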
Using Jupyter Server Proxy
If you launch a JupyterLab Workspace session in an environment with Jupyter Server Proxy installed, you can start and serve additional applications as long as they are served on a different port than JupyterLab itself.
Once an additional application is started, you can access it at thefollowing URI:
https://<DominoURL>/<dominoUsername>/<projectName>/notebookSession/<workspaceId>/proxy/<port>/
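The URI pattern above can be expressed as a small helper (a sketch; the argument values in the example are placeholders for your deployment's own hostname, user, project, and workspace ID):

```python
def proxy_url(domino_url: str, username: str, project: str,
              workspace_id: str, port: int) -> str:
    """Build the Jupyter Server Proxy URI for an app on the given port."""
    return (f"https://{domino_url}/{username}/{project}"
            f"/notebookSession/{workspace_id}/proxy/{port}/")

print(proxy_url("domino.example.com", "ada", "demo", "abc123", 8887))
# → https://domino.example.com/ada/demo/notebookSession/abc123/proxy/8887/
```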
Suppose your JupyterLab session is served at:

https://<DominoURL>/<dominoUsername>/<projectName>/notebookSession/<workspaceId>/

If you then use the JupyterLab terminal to start a Dash app on port 8887 for debugging, you could open the Dash app at:

https://<DominoURL>/<dominoUsername>/<projectName>/notebookSession/<workspaceId>/proxy/8887/

If you instead use the JupyterLab terminal to start a Bokeh app on port 5006, you could open the Bokeh app at:

https://<DominoURL>/<dominoUsername>/<projectName>/notebookSession/<workspaceId>/proxy/5006/
With this model you can host multiple applications on different ports and expose each through your JupyterLab workspace.
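A minimal sketch of such a second application, using only the Python standard library (a real Dash or Flask app would bind its port the same way; port 8887 matches the example above):

```python
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class HelloHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Respond to any GET with a plain-text greeting
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"hello from port 8887")

    def log_message(self, *args):
        pass  # silence per-request logging

def serve(port: int = 8887) -> HTTPServer:
    """Start the app on an alternate port, in a background thread."""
    server = HTTPServer(("127.0.0.1", port), HelloHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server

# Launched from the JupyterLab terminal, this app would then be reachable
# at .../notebookSession/<workspaceId>/proxy/8887/
```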
Once your app is running, if you edit its source files in JupyterLab, the edits will take effect when you restart the app in your browser.
For environments that have VSCode installed within JupyterLab, it's possible to start a VSCode session from JupyterLab, and then start an App from VSCode. This allows you to debug using VSCode.
Note that any new App or process you start and open in a separate tab will not have the Domino Workspace UI, with options to stop, sync, commit, or manage your project files. To access this UI and manage your changes, you must open the main JupyterLab tab for your Workspace session.