Do I need to learn Hadoop to use Spark?
No, you don't need to learn Hadoop to learn Spark. Spark is an independent project that became popular after Hadoop 2.0. It can run on top of HDFS, but it can also run entirely on its own (in local or standalone mode), whereas Hadoop is primarily used to write MapReduce jobs.
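As a minimal sketch of the point above, here is a PySpark job that runs in local mode with no Hadoop installation at all; the file name `data.txt` is just a placeholder, and an `hdfs://` path could be substituted only if an HDFS cluster were actually available:

```python
# Minimal sketch: Spark running in local mode, no Hadoop/HDFS required.
# Assumes PySpark is installed (e.g. `pip install pyspark`) and that a
# plain text file named data.txt exists in the working directory.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .master("local[*]")                 # local threads only, no cluster manager
    .appName("spark-without-hadoop")
    .getOrCreate()
)

# Read a local file; a path like "hdfs://namenode:9000/data.txt" would work
# the same way if a Hadoop cluster were present, but none is needed here.
lines = spark.read.text("data.txt")
print(lines.count())

spark.stop()
```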