in Big Data Hadoop & Spark by (3k points)
What are the specific downfalls in Hadoop 1.0 that have rendered it obsolete today?

1 Answer

by (6.6k points)

Hadoop 1.0 was groundbreaking as an entry point into big data analytics, but it had serious architectural flaws that later versions had to fix. The key ones are:

  • The NameNode was a single point of failure: there was no built-in high availability, and it could only be scaled vertically, not horizontally.
  • The JobTracker handled both cluster resource management and job scheduling/monitoring for every job, so it became a bottleneck on large clusters.
  • Hadoop 1.0 shipped with MRv1, so it could run only MapReduce jobs; other processing models were not supported until YARN arrived in Hadoop 2.0.
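For context, Hadoop 2.x fixed the first point with HDFS High Availability, which runs an active and a standby NameNode. A minimal hdfs-site.xml sketch of that setup (the nameservice ID `mycluster` and host names are placeholders):

```xml
<!-- Sketch of HDFS HA configuration introduced in Hadoop 2.x.
     "mycluster" and the host names are example values. -->
<configuration>
  <!-- Logical name for the pair of NameNodes -->
  <property>
    <name>dfs.nameservices</name>
    <value>mycluster</value>
  </property>
  <!-- The two NameNode IDs behind the nameservice -->
  <property>
    <name>dfs.ha.namenodes.mycluster</name>
    <value>nn1,nn2</value>
  </property>
  <!-- RPC address of each NameNode -->
  <property>
    <name>dfs.namenode.rpc-address.mycluster.nn1</name>
    <value>namenode1.example.com:8020</value>
  </property>
  <property>
    <name>dfs.namenode.rpc-address.mycluster.nn2</name>
    <value>namenode2.example.com:8020</value>
  </property>
  <!-- Client-side class that fails over to whichever NameNode is active -->
  <property>
    <name>dfs.client.failover.proxy.provider.mycluster</name>
    <value>org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider</value>
  </property>
</configuration>
```

None of this was possible in Hadoop 1.0, where a single NameNode held all filesystem metadata and its failure took down the whole cluster.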


