What is the specific function of Spark?

1 Answer


Apache Spark was built for fast, efficient computation. It is a cluster-computing framework with very fast processing capability, modeled on Hadoop MapReduce but extended to support a wider range of computations, including interactive queries and stream processing. Spark's striking feature is its in-memory cluster computing, which keeps intermediate data in memory rather than writing it to disk and thereby greatly increases processing speed. Spark's broader goal is to provide one unified engine for batch, streaming, SQL, and machine-learning workloads, reducing the number of separate tools that conventional approaches require.
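The in-memory behavior described above can be sketched in Scala. This is a minimal, hedged example assuming a local Spark installation (the `SparkSession` API, Spark 2.x or later); `local[*]` simply runs Spark on all local cores, and the object name `InMemoryExample` is illustrative.

```scala
import org.apache.spark.sql.SparkSession

object InMemoryExample {
  def main(args: Array[String]): Unit = {
    // Start a local Spark session; in a real cluster the master URL
    // would point at YARN, Mesos, or a standalone cluster manager.
    val spark = SparkSession.builder()
      .appName("InMemoryExample")
      .master("local[*]")
      .getOrCreate()

    val sc = spark.sparkContext

    // Build an RDD and cache it: the first action materializes the
    // partitions in memory, and later actions reuse them instead of
    // recomputing from the source.
    val numbers = sc.parallelize(1 to 1000000).cache()

    val evens = numbers.filter(_ % 2 == 0).count() // triggers the cache
    val total = numbers.map(_.toLong).sum()        // served from memory

    println(s"evens = $evens, sum = $total")
    spark.stop()
  }
}
```

The call to `cache()` is what distinguishes this from a MapReduce-style job: without it, each action would re-read and re-partition the source data, whereas here the second action runs against the in-memory copy.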

If you want to learn Spark with Hadoop and crack the Hadoop Developer Certification (CCA175) exam, you can sign up for Intellipaat's Hadoop Online Training.
