Is it possible to use a local compute target with the TensorFlow estimator? Provisioning a virtual machine for a training run takes an enormous amount of time, and I would like to be able to try a few runs locally until my configuration is stable.
https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/machine-learning/service/how-to-train-tensorflow.md
It is possible to do this with ScriptRunConfig by creating an empty RunConfiguration. The documentation claims that a local compute target can be used, but the instructions on how to do this are missing:
https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/machine-learning/service/how-to-set-up-training-targets.md#local
Local computer
Create and attach: There's no need to create or attach a compute target to use your local computer as the training environment.
Configure: When you use your local computer as a compute target, the training code is run in your development environment. If that environment already has the Python packages you need, use the user-managed environment.
Create an empty RunConfiguration and write this code:
from azureml.core.runconfig import RunConfiguration
# Edit a run configuration property on the fly.
run_local = RunConfiguration()
run_local.environment.python.user_managed_dependencies = True
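
Continuing from the run_local configuration above, the script can then be submitted through ScriptRunConfig and it will execute on the local machine. The following is a minimal sketch; the source directory, the train.py script name, the experiment name, and the presence of a config.json for Workspace.from_config() are illustrative assumptions, not part of the documentation.

from azureml.core import Experiment, Workspace
from azureml.core.script_run_config import ScriptRunConfig

# Assumes a config.json describing the workspace is present in the working directory.
ws = Workspace.from_config()

# Point the run at the local training script and the local run configuration above.
# 'train.py' and the experiment name 'local-tf-test' are placeholders.
src = ScriptRunConfig(source_directory='.', script='train.py', run_config=run_local)

run = Experiment(workspace=ws, name='local-tf-test').submit(src)
run.wait_for_completion(show_output=True)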