
in Big Data Hadoop & Spark by (6.5k points)

Can anyone explain what Spark is and what it is used for?

1 Answer

by (11.3k points)

Apache Spark is an open-source framework for cluster-based computing. Spark provides built-in data parallelism and fault tolerance, so it offers a simple interface for processing huge amounts of data in parallel across large clusters while recovering automatically from node failures. The framework was originally developed at the University of California, Berkeley, and is currently maintained by the Apache Software Foundation.
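To make this concrete, here is a minimal sketch using PySpark, Spark's Python API (it assumes PySpark is installed, e.g. via pip install pyspark). It runs a classic word count on a small in-memory collection; the local[*] master URL and the sample sentences are placeholders for illustration, not part of any real deployment.

from pyspark.sql import SparkSession

# Start a local Spark session; on a real cluster, master() would
# point at the cluster manager instead of local threads.
spark = SparkSession.builder.master("local[*]").appName("WordCount").getOrCreate()

# Distribute a small in-memory collection across partitions as an RDD.
lines = spark.sparkContext.parallelize([
    "spark processes data in parallel",
    "spark recovers from worker failures",
])

# Word count: the map and reduce steps run in parallel across partitions.
counts = (lines.flatMap(lambda line: line.split())
               .map(lambda word: (word, 1))
               .reduceByKey(lambda a, b: a + b))

print(counts.collect())
spark.stop()

Note that Spark evaluates transformations lazily: nothing actually executes on the cluster until an action such as collect() is called.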

You can check out this Apache Spark Tutorial by Intellipaat for a detailed explanation.
