hive: Fatal error when trying to create dynamic partitions

r1wp621o posted on 2021-05-30 in Hadoop
    create table MY_DATA0(session_id STRING, userid BIGINT, date_time STRING, ip STRING, URL STRING, country STRING, state STRING, city STRING)
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
    LINES TERMINATED BY '\n' STORED AS TEXTFILE;

    LOAD DATA INPATH '/inputhive' OVERWRITE INTO TABLE MY_DATA0;

    create table part0(session_id STRING, userid BIGINT, date_time STRING, ip STRING, URL STRING)
    partitioned by (country STRING, state STRING, city STRING)
    clustered by (userid) into 256 buckets
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
    LINES TERMINATED BY '\n' STORED AS TEXTFILE;

    insert overwrite table part0 partition(country, state, city)
    select session_id, userid, date_time, ip, url, country, state, city
    from my_data0;

An overview of my dataset:
{60a191cb-b3ca-496e-b33b-0aca551dd503},1331582487,2012-03-12 13:01:27,66.91.193.75,http://www.acme.com/sh55126545/vd55179433,USA,Hawaii
{365cc356-7822-8a42-51d2-b6396f8fc5bf},1331584835,2012-03-12 13:40:35,173.172.214.24,http://www.acme.com/sh55126545/vd55179433,USA,Texas,El Paso
When I run the last insert statement, I get this error:
java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveFatalException: [Error 20004]: Fatal error occurred when node tried to create too many dynamic partitions. The maximum number of dynamic partitions is controlled by hive.exec.max.dynamic.partitions and hive.exec.max.dynamic.partitions.pernode. Maximum was set to: 100
P.S.:
I have already set these two properties:
hive.exec.dynamic.partition.mode=nonstrict
hive.enforce.bucketing=true
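
The partition spec here is (country, state, city), so every distinct combination of those three columns in my_data0 becomes its own dynamic partition, and the insert fails as soon as one node needs more than the configured ceiling of 100. A minimal check of how many partitions the insert would actually create, assuming the table names above:

    -- Each distinct (country, state, city) combination becomes one
    -- dynamic partition of part0 during the insert overwrite.
    SELECT count(*) AS partitions_needed
    FROM (SELECT DISTINCT country, state, city FROM my_data0) t;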


tvmytwxo 1#

Try setting these properties to higher values:

SET hive.exec.max.dynamic.partitions=100000;
SET hive.exec.max.dynamic.partitions.pernode=100000;
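
Any value above the number of distinct (country, state, city) combinations in my_data0 will do; 100000 is just a comfortably large ceiling. For completeness, here is a sketch of the full session with every setting dynamic partitioning needs (the property names are standard Hive settings; the values are illustrative):

    -- Enable dynamic partitioning and drop the strict-mode requirement
    -- that at least one partition column be static.
    SET hive.exec.dynamic.partition=true;
    SET hive.exec.dynamic.partition.mode=nonstrict;
    -- Raise both partition ceilings above the number of distinct
    -- (country, state, city) combinations in the source data.
    SET hive.exec.max.dynamic.partitions=100000;
    SET hive.exec.max.dynamic.partitions.pernode=100000;
    -- Honor the CLUSTERED BY (userid) INTO 256 BUCKETS clause on part0.
    SET hive.enforce.bucketing=true;

    insert overwrite table part0 partition(country, state, city)
    select session_id, userid, date_time, ip, url, country, state, city
    from my_data0;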
