0 votes
2 views
in Big Data Hadoop & Spark by (55.6k points)

Can anyone tell me which mode requires access to the Hadoop cluster and HDFS installation?

1 Answer

0 votes
by (119k points)

Apache Pig has two execution modes: MapReduce mode and local mode. MapReduce mode is the default, and it is the mode that requires access to a Hadoop cluster and an HDFS installation: it compiles queries written in Pig Latin into MapReduce jobs and runs them on the Hadoop cluster. Local mode, by contrast, runs in a single JVM against the local filesystem, so it does not need a Hadoop cluster at all.
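Assuming a standard Pig installation, the mode is selected with the `-x` (also `-exectype`) flag of the `pig` command; the script name below is a placeholder:

```shell
# Local mode: runs in a single JVM against the local filesystem;
# no Hadoop cluster or HDFS installation required.
pig -x local myscript.pig

# MapReduce mode (the default): compiles the Pig Latin script into
# MapReduce jobs and submits them to the Hadoop cluster; requires a
# Hadoop installation and a reachable HDFS.
pig -x mapreduce myscript.pig

# Omitting -x is equivalent to -x mapreduce.
pig myscript.pig
```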

If you want to learn Hadoop, I would suggest this Big Data course by Intellipaat.

