
I am trying to run Spark SQL:

val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)


But the error I am getting is below:

        ... 125 more
Caused by: java.sql.SQLException: Another instance of Derby may have already booted the database /root/spark/bin/metastore_db.
        at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
        at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
        at org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
        at org.apache.derby.impl.jdbc.Util.generateCsSQLException(Unknown Source)
        ... 122 more
Caused by: ERROR XSDB6: Another instance of Derby may have already booted the database /root/spark/bin/metastore_db.
        at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
        at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.privGetJBMSLockOnDB(Unknown Source)
        at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.run(Unknown Source)
        at java.security.AccessController.doPrivileged(Native Method)
        at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.getJBMSLockOnDB(Unknown Source)
        at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.boot(Unknown Source)

1 Answer


I was getting the same error while creating DataFrames in the Spark shell:

Caused by: ERROR XSDB6: Another instance of Derby may have already booted the database /metastore_db.

Cause:

I found that this was happening because multiple other spark-shell instances were already running and holding the Derby metastore DB, so when I started yet another spark-shell and created a DataFrame in it using RDD.toDF(), it threw this error.
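
For reference, the kind of session that runs into this looks roughly like the sketch below (my own minimal reconstruction, not the exact code; it assumes a Spark 1.3+ spark-shell where sc is predefined, and the sample data is made up):

// Minimal sketch of the scenario (assumed Spark 1.3+ spark-shell, sc predefined).
// HiveContext backs its metastore with an embedded Derby database in ./metastore_db,
// and only one JVM can hold that database at a time, so a second shell started from
// the same directory fails with ERROR XSDB6.
val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
import sqlContext.implicits._
val df = sc.parallelize(Seq(("Ann", 30), ("Bob", 25))).toDF("name", "age")
df.show()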

Solution:

I ran the ps command to find the other spark-shell instances:

ps -ef | grep spark-shell

and killed them all with the kill command:

kill -9 <spark-shell-process-id>   (for example: kill -9 4848)

After all the spark-shell instances were gone, I started a new spark-shell, reran my DataFrame code, and it ran just fine.
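
A quick way to confirm the metastore is healthy again is to create the HiveContext in the new shell and run a trivial query against it (just a smoke test, assuming a Hive-enabled Spark 1.3+ build):

// The HiveContext should now acquire the Derby lock without complaint.
val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
// Any metastore-backed query works as a check; SHOW TABLES is the simplest.
sqlContext.sql("SHOW TABLES").show()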
