
What is PySpark SparkContext?

1 Answer


SparkContext is the entry point to any Spark functionality: it is the first object a Spark application must create, since it establishes the connection to the cluster. In PySpark, the SparkContext constructor accepts several parameters, including master, appName, sparkHome, pyFiles, environment, batchSize, serializer, and conf.

