0 votes
1 view
in Big Data Hadoop & Spark by (38.9k points)

Could someone tell me what is “sc” in PySpark?

1 Answer

0 votes
by (88.4k points)

In PySpark, `sc` is the conventional variable name for the SparkContext object. It is the entry point to core Spark functionality: the driver program creates a SparkContext when the Spark application starts, and operations such as creating RDDs, accumulators, and broadcast variables go through it. In the interactive PySpark shell, `sc` is created for you automatically; in a standalone application you create it yourself.
