0 votes
2 views
in Big Data Hadoop & Spark by (11.4k points)

I'm using HiveContext with Spark SQL and I'm trying to connect to a remote Hive metastore. So far the only way to set the metastore has been to include hive-site.xml on the classpath (or to copy it to /etc/spark/conf/).

Is there a way to set this parameter programmatically in Java code without including hive-site.xml? If so, what is the Spark configuration to use?

1 Answer

0 votes
by (32.3k points)

For Spark 1.x, you can set it with:

import org.apache.spark.SparkConf;
import org.apache.spark.SparkContext;
import org.apache.spark.sql.hive.HiveContext;

// Set the metastore URI as a JVM system property before the HiveContext is created
System.setProperty("hive.metastore.uris", "thrift://METASTORE:9083");

final SparkConf conf = new SparkConf();
SparkContext sc = new SparkContext(conf);
HiveContext hiveContext = new HiveContext(sc);

Or:

final SparkConf conf = new SparkConf();
SparkContext sc = new SparkContext(conf);
HiveContext hiveContext = new HiveContext(sc);

// Set the metastore URI on the HiveContext after it has been created
hiveContext.setConf("hive.metastore.uris", "thrift://METASTORE:9083");
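
If you are on Spark 2.x or later, HiveContext has been superseded by SparkSession, and the same property can be passed through the builder. A minimal sketch, assuming the same thrift://METASTORE:9083 placeholder and a hypothetical application name:

import org.apache.spark.sql.SparkSession;

SparkSession spark = SparkSession.builder()
        .appName("remote-metastore-example")                       // hypothetical app name
        .config("hive.metastore.uris", "thrift://METASTORE:9083")  // metastore URI set programmatically
        .enableHiveSupport()                                       // back the catalog with the Hive metastore
        .getOrCreate();

// Tables now resolve against the remote metastore
spark.sql("show databases").show();

The enableHiveSupport() call is what tells Spark to use the Hive metastore for its catalog; the config() call supplies the URI programmatically, so no hive-site.xml is needed.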

If your Hive is Kerberized, try setting these before creating the HiveContext:

System.setProperty("hive.metastore.sasl.enabled", "true");
System.setProperty("hive.security.authorization.enabled", "false");
System.setProperty("hive.metastore.kerberos.principal", hivePrincipal);
System.setProperty("hive.metastore.execute.setugi", "true");
