PySpark is the Python API for Apache Spark, an open-source big data processing framework written in Scala. The main differences between PySpark and native (Scala) Spark are:
1. PySpark programs are written in Python, while native Spark programs are written in Scala (or Java).
2. PySpark is generally easier to pick up because Python has a gentler learning curve, while writing Spark in Scala demands more programming expertise.
3. PySpark can be slower than Scala Spark because data must be serialized between the JVM and Python worker processes whenever Python code runs on it, while native Scala code executes directly on the JVM; the sketch after this list shows how PySpark's DataFrame API avoids most of this overhead.
4. PySpark exposes most, but not all, of Spark's libraries: Spark SQL, MLlib, and Structured Streaming are available from Python, but GraphX, for example, can only be used from Scala or Java.
5. Spark has a larger community of users and contributors than PySpark.
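To make the performance difference in point 3 concrete, here is a minimal sketch, assuming a local Spark installation and the standard pyspark package (the app name "PySparkDemo" is illustrative). It contrasts an RDD transformation, where every row is shipped to a Python worker process and back, with an equivalent DataFrame expression that Spark's Catalyst optimizer compiles into a JVM execution plan:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Illustrative app name; any local Spark installation will do.
spark = SparkSession.builder.appName("PySparkDemo").getOrCreate()

# One column "n" holding the integers 0..999999.
df = spark.range(1_000_000).toDF("n")

# RDD route: the lambda runs in Python worker processes, so each row is
# serialized from the JVM to Python and back (the overhead noted in point 3).
rdd_sum = df.rdd.map(lambda row: row.n * 2).sum()

# DataFrame route: the expression is compiled to a JVM execution plan,
# so no per-row Python serialization takes place.
df_sum = df.select(F.sum(F.col("n") * 2)).first()[0]

print(rdd_sum, df_sum)  # both print 999999000000
spark.stop()
```

In practice this is why PySpark code that sticks to the DataFrame API performs close to its Scala equivalent, while heavy use of Python lambdas in RDD operations is where the gap in point 3 shows up.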
If you are interested in learning more, check out the video tutorial on PySpark below: