Apache Spark is a data-processing framework for massive datasets that cannot be handled with traditional, single-machine methods. Spark's core abstraction is the RDD (Resilient Distributed Dataset), a distributed, list-like layer over conventional data structures that lets operations run on different nodes in parallel.
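To make that concrete, here is a minimal sketch in Scala using the standard Spark core API (SparkConf, SparkContext, parallelize); the app name, master setting, and computation are illustrative choices, not from the original text:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object RddSketch {
  def main(args: Array[String]): Unit = {
    // Local mode ("local[*]") is used here just for the sketch; on a cluster
    // the same code runs with the master set to the cluster manager.
    val conf = new SparkConf().setAppName("rdd-sketch").setMaster("local[*]")
    val sc   = new SparkContext(conf)

    // parallelize() turns a local collection into an RDD whose partitions
    // can be processed on different executors (or cores, in local mode) in parallel.
    val numbers = sc.parallelize(1 to 1000000)

    // map and reduce run per partition across the cluster;
    // only the final result comes back to the driver.
    val sumOfSquares = numbers.map(n => n.toLong * n).reduce(_ + _)
    println(s"Sum of squares: $sumOfSquares")

    sc.stop()
  }
}
```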
Earlier, Spark depended on the Akka toolkit for communication between nodes. As of Spark 1.6, Spark no longer depends on Akka.
Akka is an actor framework for the JVM. It is inspired by Erlang and supports actor-based distributed concurrency. The Actor Model provides a higher level of abstraction for writing concurrent and distributed applications, freeing the developer from explicit locking and thread management. Akka makes it easier to write correct concurrent and parallel applications.
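The sketch below shows the actor model with the classic Akka API (akka.actor); the actor and message names are made up for illustration. Each actor processes one message at a time, which is why its internal state can be mutated without locks:

```scala
import akka.actor.{Actor, ActorSystem, Props}

// The message type this actor understands (illustrative).
case class Greet(name: String)

class Greeter extends Actor {
  // receive handles one message at a time, so this mutable counter
  // needs no explicit synchronization.
  private var greetCount = 0

  def receive: Receive = {
    case Greet(name) =>
      greetCount += 1
      println(s"Hello, $name! (greeting #$greetCount)")
  }
}

object ActorSketch extends App {
  val system  = ActorSystem("sketch")
  val greeter = system.actorOf(Props[Greeter](), "greeter")

  // "!" (tell) sends a message asynchronously; the sender never blocks.
  greeter ! Greet("Spark")
  greeter ! Greet("Akka")

  system.terminate()
}
```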
Major use cases: