Will Spark leave Hadoop behind?
One thing you must note before starting your Hadoop certification course is that Hadoop is essentially a parallel data-processing framework built around the MapReduce programming model, with HDFS as its storage layer. Spark, on the other hand, is designed to run on top of Apache Hadoop (or standalone) and is geared toward fast, in-memory and near-real-time processing. Note that Spark is an alternative to Hadoop's MapReduce engine, not a replacement for Hadoop itself, since it can still use HDFS and YARN. To know more about the differences, go through this blog on Hadoop vs. Spark.
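To make the "alternative to MapReduce, not a replacement for Hadoop" point concrete, here is a minimal PySpark sketch of the classic word-count job. It assumes Spark is installed locally; the HDFS input path is hypothetical and only illustrates that Spark can read directly from Hadoop's storage layer while replacing the MapReduce processing engine.

```python
# Minimal PySpark word count, the canonical MapReduce example.
# Assumes a working Spark installation; the HDFS path below is hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("WordCount").getOrCreate()
sc = spark.sparkContext

# Spark reads from HDFS, so it complements Hadoop's storage (HDFS)
# while standing in for the MapReduce engine itself.
lines = sc.textFile("hdfs:///user/example/input.txt")  # hypothetical path

counts = (lines.flatMap(lambda line: line.split())    # "map" phase: split into words
               .map(lambda word: (word, 1))           # emit (word, 1) pairs
               .reduceByKey(lambda a, b: a + b))      # "reduce" phase: sum counts

# Print a small sample of the results.
for word, count in counts.take(10):
    print(word, count)

spark.stop()
```

Because the intermediate results stay in memory rather than being written back to disk between the map and reduce phases, this typically runs much faster than the equivalent Hadoop MapReduce job on the same HDFS data.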