How to set hive.metastore.warehouse.dir in HiveContext?

tribbloid · May 29, 2015

I'm trying to write a unit test case that relies on DataFrame.saveAsTable() (since it is backed by a file system). I point the Hive warehouse parameter at a local disk location:

sql.sql(s"SET hive.metastore.warehouse.dir=file:///home/myusername/hive/warehouse")

By default, the metastore should run in embedded mode, which doesn't require an external database.

But HiveContext seems to be ignoring this configuration: I still get this error when calling saveAsTable():

MetaException(message:file:/user/hive/warehouse/users is not a directory or unable to create one)
org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:file:/user/hive/warehouse/users is not a directory or unable to create one)
    at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:619)
    at org.apache.spark.sql.hive.HiveMetastoreCatalog.createDataSourceTable(HiveMetastoreCatalog.scala:172)
    at org.apache.spark.sql.hive.execution.CreateMetastoreDataSourceAsSelect.run(commands.scala:224)
    at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:54)
    at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:54)
    at org.apache.spark.sql.execution.ExecutedCommand.execute(commands.scala:64)
    at org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:1099)
    at org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:1099)
    at org.apache.spark.sql.DataFrame.saveAsTable(DataFrame.scala:1121)
    at org.apache.spark.sql.DataFrame.saveAsTable(DataFrame.scala:1071)
    at org.apache.spark.sql.DataFrame.saveAsTable(DataFrame.scala:1037)

This is quite annoying. Why is it still happening, and how can I fix it?

Answer

David S. · Oct 10, 2016

According to http://spark.apache.org/docs/latest/sql-programming-guide.html#sql:

Note that the hive.metastore.warehouse.dir property in hive-site.xml is deprecated since Spark 2.0.0. Instead, use spark.sql.warehouse.dir to specify the default location of database in warehouse.
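
In practice that means configuring the warehouse location when the SparkSession is built, before anything touches the metastore; changing it afterwards has no effect. A minimal Spark 2.x sketch (the path, master, and app name are placeholders):

import org.apache.spark.sql.SparkSession

// Spark 2.x: spark.sql.warehouse.dir is a static config, so it must be set
// when the session is created; it cannot be changed on a running session.
val spark = SparkSession.builder()
  .master("local[2]")
  .appName("warehouse-test")
  .config("spark.sql.warehouse.dir", "file:///home/myusername/hive/warehouse")
  .enableHiveSupport()  // requires Hive classes on the classpath
  .getOrCreate()

// Tables now land under the configured warehouse directory
spark.range(10).write.saveAsTable("users")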