

My map tasks need some configuration data, which I would like to distribute via the Distributed Cache.

The Hadoop MapReduce Tutorial shows the usage of the DistributedCache class, roughly as follows:

// In the driver
JobConf conf = new JobConf(getConf(), WordCount.class);
...
DistributedCache.addCacheFile(new Path(filename).toUri(), conf);

// In the mapper
Path[] myCacheFiles = DistributedCache.getLocalCacheFiles(job);
...

However, DistributedCache is marked as deprecated in Hadoop 2.2.0.

What is the new preferred way to achieve this? Is there an up-to-date example or tutorial covering this API? 

1 Answer


The Distributed Cache methods now live on the Job class itself, with the read-side counterparts on JobContext (which the mapper's Context extends). In the driver, the code should look something like this:

Job job = Job.getInstance(getConf()); // new Job() is itself deprecated; use the factory method
...
job.addCacheFile(new Path(filename).toUri());

In your mapper code:

Path[] localPaths = context.getLocalCacheFiles();
...
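
Note that getLocalCacheFiles() was itself deprecated in later Hadoop releases. If you want to stay off the deprecation list entirely, context.getCacheFiles() returns the cache entries as URIs rather than local Paths:

URI[] cacheFiles = context.getCacheFiles();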

Check the documentation here: http://hadoop.apache.org/docs/stable2/api/org/apache/hadoop/mapreduce/Job.html
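
To put the pieces together, here is a minimal end-to-end sketch: a driver that registers a file with job.addCacheFile() and a mapper that loads it in setup(). The class names (CacheFileExample, ConfigMapper) and the key=value file format are illustrative assumptions, not something from the Hadoop tutorial:

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;

import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class CacheFileExample extends Configured implements Tool {

    public static class ConfigMapper
            extends Mapper<LongWritable, Text, Text, Text> {

        private final Map<String, String> config = new HashMap<String, String>();

        @Override
        protected void setup(Context context)
                throws IOException, InterruptedException {
            // Local, node-side copies of the files registered with addCacheFile()
            Path[] localPaths = context.getLocalCacheFiles();
            if (localPaths != null && localPaths.length > 0) {
                BufferedReader reader =
                        new BufferedReader(new FileReader(localPaths[0].toString()));
                try {
                    String line;
                    while ((line = reader.readLine()) != null) {
                        // Assumes a simple "key=value" format -- purely illustrative
                        String[] parts = line.split("=", 2);
                        if (parts.length == 2) {
                            config.put(parts[0], parts[1]);
                        }
                    }
                } finally {
                    reader.close();
                }
            }
        }

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            // Use the cached configuration however your job requires
            context.write(value, new Text(config.toString()));
        }
    }

    @Override
    public int run(String[] args) throws Exception {
        // args[0] = input dir, args[1] = output dir, args[2] = config file on HDFS
        Job job = Job.getInstance(getConf(), "cache-file-example");
        job.setJarByClass(CacheFileExample.class);
        job.setMapperClass(ConfigMapper.class);
        job.setNumReduceTasks(0);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(Text.class);

        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));

        // Replaces DistributedCache.addCacheFile(uri, conf)
        job.addCacheFile(new Path(args[2]).toUri());

        return job.waitForCompletion(true) ? 0 : 1;
    }

    public static void main(String[] args) throws Exception {
        System.exit(ToolRunner.run(new CacheFileExample(), args));
    }
}

You would run it with something like hadoop jar myjob.jar CacheFileExample /input /output /config/app.properties (all paths here are placeholders).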

