in Big Data Hadoop & Spark by (36k points)

Can anyone tell me which Pig execution mode requires access to a Hadoop cluster and an HDFS installation?

1 Answer

by (86.8k points)

Pig scripts can run in two execution modes: local mode and MapReduce mode. In MapReduce mode, Pig scripts are compiled into MapReduce jobs, and these jobs need access to a Hadoop cluster and an HDFS installation in order to run. In short, MapReduce mode is the one that requires access to the Hadoop cluster to run Pig scripts.
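To make the distinction concrete, here is a sketch of how the two modes are typically launched from the command line with the `pig` binary's `-x` flag (the script name `script.pig` is just a placeholder):

```shell
# Local mode: runs on the local filesystem in a single JVM,
# no Hadoop cluster or HDFS installation needed.
pig -x local script.pig

# MapReduce mode (the default): compiles the script into MapReduce
# jobs, so it requires access to a Hadoop cluster and HDFS.
pig -x mapreduce script.pig
```

Because MapReduce mode is the default, running `pig script.pig` with no `-x` flag also assumes a working cluster and HDFS.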

If you wish to learn Hadoop from industry experts, you can sign up for this Big Data Course by Intellipaat.

