Support Dynamic Partition Inserts params with SET command in Spark 2.0.1

Support Dynamic Partition Inserts params with SET command in Spark 2.0.1

Chetan Khatri
Hey Dev/User,

I am working with Spark 2.0.1, doing dynamic-partition inserts into Hive, and I am facing the issue below:

org.apache.hadoop.hive.ql.metadata.HiveException:
Number of dynamic partitions created is 1344, which is more than 1000.
To solve this try to set hive.exec.max.dynamic.partitions to at least 1344.

I tried the option below, but it failed:

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().enableHiveSupport().getOrCreate()

spark.sqlContext.setConf("hive.exec.max.dynamic.partitions", "2000")

Please help with an alternate workaround!
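[Editor's note: a commonly suggested workaround, not confirmed anywhere in this thread, is to set the property before the session is created, either in hive-site.xml or via the builder, since a setConf issued after getOrCreate may arrive too late to reach the Hive client that performs the insert. A minimal sketch under that assumption:]

```scala
import org.apache.spark.sql.SparkSession

// Sketch only: pass the Hive properties to the builder so they are in place
// before the Hive-backed session is initialized. Requires a Spark deployment
// with Hive support; not runnable standalone.
val spark = SparkSession.builder()
  .enableHiveSupport()
  .config("hive.exec.max.dynamic.partitions", "2000")          // total cap
  .config("hive.exec.max.dynamic.partitions.pernode", "2000")  // per-node cap
  .getOrCreate()
```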

Thanks

Re: Support Dynamic Partition Inserts params with SET command in Spark 2.0.1

Jörn Franke
Try sparkSession.conf.set
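[Editor's note: spelled out, that runtime-config call looks like the sketch below; `spark` is assumed to be an existing Hive-enabled SparkSession.]

```scala
// Runtime-config route suggested above: set the property on the live session.
spark.conf.set("hive.exec.max.dynamic.partitions", "2000")
```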


Re: Support Dynamic Partition Inserts params with SET command in Spark 2.0.1

Chetan Khatri
Jörn, both are the same.

On Fri, Jul 28, 2017 at 4:18 PM, Jörn Franke <[hidden email]> wrote:
Try sparksession.conf().set



Re: Support Dynamic Partition Inserts params with SET command in Spark 2.0.1

Chetan Khatri
I think it will be the same, but let me try that.


On Fri, Jul 28, 2017 at 4:44 PM, ayan guha <[hidden email]> wrote:
Try running spark.sql("set yourconf=val")
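[Editor's note: with the property from the original post filled in, that SET statement would read as below; whether it propagates to the Hive client on this Spark version is exactly what the thread leaves unresolved.]

```scala
// SQL SET route: the same property issued as a SQL statement on the session.
spark.sql("SET hive.exec.max.dynamic.partitions=2000")
```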


--
Best Regards,
Ayan Guha
