in Big Data Hadoop & Spark by (6.5k points)

Can anyone tell me how to check the Spark version in PySpark?

1 Answer

by (11.3k points)

You can check the current Spark version in PySpark with the following command, assuming your Spark context variable is named 'sc':

sc.version
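sc.version returns the version as a plain string (for example "3.5.1"), which is awkward to compare directly. A minimal sketch of a helper for turning that string into a comparable tuple — parse_spark_version is a hypothetical name, not part of the PySpark API, and the "-vendor" suffix handling is an assumption about common distro version strings:

```python
def parse_spark_version(version: str) -> tuple:
    """Parse a Spark version string (like the one sc.version returns)
    into a tuple of ints so versions can be compared numerically."""
    # Drop any vendor suffix such as "3.3.0-amzn-1" before splitting.
    core = version.split("-")[0]
    return tuple(int(part) for part in core.split("."))


if __name__ == "__main__":
    v = parse_spark_version("3.5.1")
    print(v)              # (3, 5, 1)
    print(v >= (3, 0, 0))  # True: running Spark 3.x or newer
```

In a real session you would call parse_spark_version(sc.version) (or spark.version, which the SparkSession exposes equivalently) and branch on the result.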

