
Can anyone tell me how to check the Spark version in PySpark?

1 Answer


You can simply run the following command to find the current Spark version in PySpark, assuming the SparkContext variable is named 'sc':

sc.version
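
If you are working with a SparkSession (the default entry point since Spark 2.x), the version is exposed there as well. Below is a minimal sketch, assuming you are running a standalone script rather than the PySpark shell (where 'spark' and 'sc' already exist); the app name "version-check" is just an illustrative placeholder:

# Build (or reuse) a SparkSession and print the Spark version.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("version-check").getOrCreate()
print(spark.version)                 # version string from the SparkSession
print(spark.sparkContext.version)    # same value via the underlying SparkContext
spark.stop()

Outside of Python, running spark-submit --version (or pyspark --version) on the command line prints the same information.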



