in Big Data Hadoop & Spark by (55.5k points)

Can anyone tell me which Apache Pig execution mode requires access to the Hadoop cluster and HDFS installation?

1 Answer

by (119k points)

Apache Pig has two execution modes: local mode and MapReduce mode. MapReduce mode is the default, and it is the one that requires access to a Hadoop cluster and an HDFS installation. In this mode, Pig translates queries written in Pig Latin into MapReduce jobs and submits them to the Hadoop cluster for execution. Local mode, by contrast, runs in a single JVM against the local file system and does not need a Hadoop cluster.
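As a rough illustration, the execution mode is selected with the `-x` flag when launching Pig. This sketch assumes Pig is installed and configured; `script.pig` is a placeholder name for your own Pig Latin script:

```shell
# Local mode: runs in a single JVM against the local file system;
# no Hadoop cluster or HDFS needed. Useful for testing on small data.
pig -x local script.pig

# MapReduce mode: compiles the Pig Latin script into MapReduce jobs
# and runs them on the Hadoop cluster; requires access to the cluster
# and HDFS.
pig -x mapreduce script.pig

# MapReduce mode is the default, so this is equivalent:
pig script.pig
```

Running `pig` with no script argument opens the interactive Grunt shell in the chosen mode instead.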

