Ignoring non-spark config property: hive.exec.dynamic.partition.mode

Peter Krauss · Oct 30, 2019 · Viewed 10k times

How to run a Spark-shell with hive.exec.dynamic.partition.mode=nonstrict?

I tried (as suggested here):

  export SPARK_MAJOR_VERSION=2; spark-shell --conf "hive.exec.dynamic.partition.mode=nonstrict" --properties-file /opt/_myPath_/sparkShell.conf

but got the warning "Ignoring non-spark config property: hive.exec.dynamic.partition.mode=nonstrict".


PS: using Spark version 2.2.0.2.6.4.0-91, Scala version 2.11.8

NOTE

The need arose after an error on df.write.mode("overwrite").insertInto("db.partitionedTable"):

org.apache.spark.SparkException: Dynamic partition strict mode requires at least one static partition column. To turn this off set hive.exec.dynamic.partition.mode=nonstrict

Answer

mazaneicha · Oct 31, 2019

You can try using the spark.hadoop.* prefix, as suggested in the Custom Spark Configuration section of the docs for version 2.3. It might work in 2.2 as well, if that was just a doc bug :)

spark-shell \
  --conf "spark.hadoop.hive.exec.dynamic.partition=true" \
  --conf "spark.hadoop.hive.exec.dynamic.partition.mode=nonstrict" \
  ...
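Alternatively, if restarting spark-shell is inconvenient, the same Hive properties can usually be set at runtime through the session's SQLContext. A minimal sketch (assuming a Hive-enabled SparkSession, and the hypothetical table name from the question):

```scala
// Inside spark-shell (Spark 2.x): set the Hive dynamic-partition
// properties on the running session instead of at launch time.
spark.sqlContext.setConf("hive.exec.dynamic.partition", "true")
spark.sqlContext.setConf("hive.exec.dynamic.partition.mode", "nonstrict")

// The overwrite-insert from the question should then proceed without
// requiring a static partition column (table name is illustrative):
df.write.mode("overwrite").insertInto("db.partitionedTable")
```

Note that per-session conf changes only affect the current session; the spark.hadoop.* launch flags above are the way to make them apply from startup.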