0 votes
2 views
in Big Data Hadoop & Spark by (11.4k points)

I am new to Spark.

While using pyspark, I attempt to initialize a new SparkContext:

from pyspark import SparkContext
sc = SparkContext("local[4]", "test")


I get the following error:

ValueError: Cannot run multiple SparkContexts at once


I'm wondering if my previous attempts at running example code loaded something into memory that didn't clear out.

1 Answer

0 votes
by (32.3k points)

Running ./bin/pyspark interactively automatically creates a SparkContext for you, available as sc.

You will even see a message saying so on the screen when pyspark starts.

So you can either stop the existing context with sc.stop() (or remove it with del sc) at the beginning and then create a new SparkContext:

sc.stop()
sc = SparkContext.getOrCreate()

 

or simply carry on and use the sc that is already defined.
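
For reference, here is a minimal sketch of the stop-and-recreate approach, reusing the master string "local[4]" and app name "test" from the question (the SparkConf settings are just an illustration; adjust them to your setup):

from pyspark import SparkConf, SparkContext

# In the pyspark shell, sc already exists, so stop it first;
# in a standalone script this line can be skipped.
sc.stop()

# Build the desired configuration and get (or create) a context with it.
# getOrCreate() returns the active SparkContext if one exists,
# otherwise it creates a new one from the given configuration.
conf = SparkConf().setMaster("local[4]").setAppName("test")
sc = SparkContext.getOrCreate(conf)

print(sc.parallelize(range(10)).sum())  # quick sanity check: should print 45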
