0 votes
in Azure by (5.8k points)
I have just started to get myself acquainted with parallelism in R.

As I am planning to use Microsoft Azure Machine Learning Studio for my project, I started investigating what Microsoft R Open offers for parallelism, and I found this, which says that parallelization happens under the hood, leveraging all available cores without any changes to the R code. The article also shows some performance benchmarks, though most of them demonstrate the benefit for mathematical operations.

This was good so far. In addition, I am also interested to know whether it parallelizes the *apply functions under the hood as well. I also found these 2 articles that describe how to parallelize *apply functions in general:

A quick guide to parallel R with snow: describes facilitating parallelism using the snow package, the par*apply function family, and clusterExport.

A gentle introduction to parallel computing in R: uses the parallel package, the par*apply function family, and binding values to the environment.
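To make the pattern from those two articles concrete, here is a minimal sketch using the parallel package (which ships with base R). The cluster size of 2 and the variable name scale_factor are just illustrative choices:

```r
library(parallel)

cl <- makeCluster(2)                # start 2 worker processes
scale_factor <- 10
clusterExport(cl, "scale_factor")   # copy the variable into each worker's environment

# parLapply is the parallel counterpart of lapply
res <- parLapply(cl, 1:8, function(i) i * scale_factor)
stopCluster(cl)                     # always shut the workers down

unlist(res)  # 10 20 30 40 50 60 70 80
```

The clusterExport call matters because each worker is a fresh R process: globals referenced inside the function are not visible there unless you export them first.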

So my question is: when I use *apply functions in Microsoft Azure Machine Learning Studio, will they be parallelized under the hood by default, or do I need to make use of packages like parallel, snow, etc.?

1 Answer

0 votes
by (9.6k points)

MRO uses multiple threads to fit models, which is where the speedups come from.

In MRO, the default Rblas is replaced with one backed by MKL (Intel's Math Kernel Library). MKL only affects code that performs linear algebra. It won't specifically speed up the *apply functions, but it won't slow them down either; to parallelize *apply calls you still need packages like parallel or snow.
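A small sketch of the distinction described above: the matrix multiply goes through BLAS, so MKL can spread it across cores, while the lapply call runs on a single core regardless of which BLAS is installed (the matrix size and workload here are arbitrary):

```r
# BLAS-backed linear algebra: MKL can multi-thread this under MRO
m <- matrix(rnorm(500 * 500), nrow = 500)
xtx <- crossprod(m)          # computes t(m) %*% m via BLAS

# A plain lapply stays single-threaded no matter which BLAS is used;
# to spread it over cores you would reach for parallel::parLapply instead
sums <- lapply(1:4, function(i) sum(rnorm(1e4)))
```

So on Azure ML Studio the MKL-backed speedups apply automatically to linear-algebra-heavy code, but *apply loops must be parallelized explicitly.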
