in Big Data Hadoop & Spark by (45k points)

Could someone tell me what "sc" is in PySpark?

1 Answer

by (99k points)

In PySpark, "sc" is the SparkContext: the entry point to any Spark functionality. It is created when the driver program of a Spark application starts, and in the PySpark shell it is instantiated automatically and exposed as the variable `sc`. You use it to connect to the cluster and to create RDDs, accumulators, and broadcast variables.

