in Big Data Hadoop & Spark by (11.5k points)

I have a Spark application that uses the new Spark 2.0 API with SparkSession. I am building it on top of another application that uses SparkContext. I would like to pass that SparkContext to my application and initialize a SparkSession from the existing SparkContext.

However, I could not find a way to do that. The SparkSession constructor that takes a SparkContext is private, so I can't initialize it that way, and the builder does not offer any setSparkContext method. Is there a workaround?

1 Answer

by (24.8k points)

I don't think there is a way to initialize a SparkSession directly from an existing SparkContext, as SparkSession's constructor is private. Instead, I would suggest you create an SQLContext from the SparkContext and then get the SparkSession from the SQLContext, like this:

import org.apache.spark.sql.SQLContext

val sqlContext = new SQLContext(sparkContext)

val spark = sqlContext.sparkSession

...
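A minimal end-to-end sketch of the workaround above, assuming Spark 2.x on the classpath. The SparkContext here is created locally only for illustration; in the scenario from the question it would be the one handed over by the host application. Note that `new SQLContext(sc)` is deprecated in Spark 2.x but still works, and the resulting session shares the same underlying SparkContext:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Stand-in for the SparkContext owned by the host application
val conf = new SparkConf().setAppName("demo").setMaster("local[*]")
val sc = new SparkContext(conf)

// Wrap the existing SparkContext in an SQLContext,
// then pull out the SparkSession it carries internally
val sqlContext = new SQLContext(sc)
val spark = sqlContext.sparkSession

// The session reuses the same underlying SparkContext
assert(spark.sparkContext eq sc)

spark.stop()
```

Because the SparkSession wraps the same SparkContext rather than creating a new one, any configuration already set on the host application's context is preserved.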