in Big Data Hadoop & Spark
What is the specific function of Spark?

1 Answer

Apache Spark was built for fast, general-purpose cluster computing. It extends the Hadoop MapReduce model to cover a much wider range of computations, including interactive queries and stream processing. Its standout feature is in-memory cluster computing: intermediate data is kept in RAM rather than written to disk between stages, which dramatically speeds up iterative and interactive workloads. By handling batch, streaming, interactive, and iterative jobs in a single engine, Spark reduces the number of separate tools that conventional approaches would require you to stitch together.
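For a concrete sense of that in-memory speedup, here is a minimal PySpark sketch (assuming a local Spark installation and the pyspark package; the app name and dataset are made up for illustration). The cache() call keeps the computed RDD in memory after the first action, so the second action is served from RAM instead of being recomputed, which is exactly Spark's advantage over MapReduce's disk-based model.

from pyspark.sql import SparkSession

# Start a local Spark session; "local[*]" uses all available cores.
spark = SparkSession.builder \
    .appName("InMemoryDemo") \
    .master("local[*]") \
    .getOrCreate()

# Distribute a dataset across the cluster (here, local cores).
nums = spark.sparkContext.parallelize(range(1, 1_000_001))

# cache() keeps the transformed RDD in memory once it is computed,
# so later actions reuse it instead of recomputing from scratch.
squares = nums.map(lambda x: x * x).cache()

print(squares.sum())    # first action: computes and caches the RDD
print(squares.count())  # second action: served from memory

spark.stop()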

If you want to learn Spark with Hadoop and crack the Hadoop Developer Certification (CCA175) exam, you can sign up for Intellipaat's Hadoop Online Training.
