Error while processing statement: FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask (state=08S01, code=2)

yyhrrdl8 · posted 2021-06-27 in Hive

I had been running this Hive job successfully, but since yesterday it fails right after the mapper stage completes. The query and the logs are below:

INSERT INTO TABLE zong_dwh.TEMP_P_UFDR_imp6
SELECT
  from_unixtime(begin_time+5*3600,'yyyy-MM-dd') AS Date1,
  from_unixtime(begin_time+5*3600,'HH') AS Hour1,
  MSISDN AS MSISDN,
  A.prot_type AS Protocol,
  B.protocol AS Application,
  host AS Domain,
  D.browser_name AS browser_type,
  CAST(NULL AS varchar(10)) AS media_format,
  C.ter_type_name_en AS device_category,
  C.ter_brand_name AS device_brand,
  rat AS session_technology,
  CASE
    WHEN rat=1 THEN concat(mcc,mnc,lac,ci)
    WHEN rat=2 THEN concat(mcc,mnc,lac,sac)
    WHEN rat=6 THEN concat(mcc,mnc,eci)
  END AS Actual_Site_ID,
  SUM(coalesce(L4_DW_THROUGHPUT,0)+coalesce(L4_UL_THROUGHPUT,0)) AS total_data_volume,
  SUM(coalesce(TCP_UL_RETRANS_WITHPL,0)/coalesce(TCP_DW_RETRANS_WITHPL,1)) AS retrans_rate,
  SUM(coalesce(DATATRANS_UL_DURATION,0)+coalesce(DATATRANS_DW_DURATION,0)) AS duration,
  COUNT(sessionkey) AS usage_quantity,
  ROUND(SUM(L4_DW_THROUGHPUT)/1024/1024,4)/SUM(end_time*1000+end_time_msel-begin_time*1000-begin_time_msel) AS downlink_throughput,
  ROUND(SUM(L4_UL_THROUGHPUT)/1024/1024,4)/SUM(end_time*1000+end_time_msel-begin_time*1000-begin_time_msel) AS uplink_throughput
FROM
  ps.detail_ufdr_http_browsing_17923 A
  INNER JOIN ps.dim_protocol B ON B.protocol_id=A.prot_type
  INNER JOIN ps.dim_terminal C ON substr(A.imei,1,8)=C.tac
  INNER JOIN ps.dim_browser_type D ON A.browser_type=D.browser_type_id
GROUP BY
  from_unixtime(begin_time+5*3600,'yyyy-MM-dd'),
  from_unixtime(begin_time+5*3600,'HH'),
  MSISDN,
  prot_type,
  B.protocol,
  host,
  D.browser_name,
  CAST(NULL AS varchar(10)),
  C.ter_type_name_en,
  C.ter_brand_name,
  rat,
  CASE
    WHEN rat=1 THEN concat(mcc,mnc,lac,ci)
    WHEN rat=2 THEN concat(mcc,mnc,lac,sac)
    WHEN rat=6 THEN concat(mcc,mnc,eci)
  END;

Log:
Error: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Error while processing row (tag=0) {"key":{"_col0":"2019-02-11","_col1":"05","_col2":"3002346407","_col3":146,"_col4":"","_col5":null,"_col6":null,"_col7":"35538908","_col8":6,"_col9":"","_col10":"","_col11":"","_col12":"0ed1102"},"value":{"_col0":75013,"_col1":4.0,"_col2":2253648000,"_col3":5,"_col4":0}}
    at org.apache.hadoop.hive.ql.exec.mr.ExecReducer.reduce(ExecReducer.java:256)
    at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:444)
    at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:392)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:182)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1769)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:176)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row (tag=0) {"key":{"_col0":"2019-02-11","_col1":"05","_col2":"3002346407","_col3":146,"_col4":"","_col5":null,"_col6":null,"_col7":"35538908","_col8":6,"_col9":"","_col10":"","_col11":"","_col12":"0ed1102"},"value":{"_col0":75013,"_col1":4.0,"_col2":2253648000,"_col3":5,"_col4":0}}
    at org.apache.hadoop.hive.ql.exec.mr.ExecReducer.reduce(ExecReducer.java:244)
    ... 7 more
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Unable to execute method public org.apache.hadoop.io.Text org.apache.hadoop.hive.ql.udf.UDFConv.evaluate(org.apache.hadoop.io.Text, org.apache.hadoop.io.IntWritable, org.apache.hadoop.io.IntWritable) on object org.apache.hadoop.hive.ql.udf.UDFConv@2e2f720 of class org.apache.hadoop.hive.ql.udf.UDFConv with arguments {:org.apache.hadoop.io.Text, 16:org.apache.hadoop.io.IntWritable, 10:org.apache.hadoop.io.IntWritable} of size 3
    at org.apache.hadoop.hive.ql.exec.FunctionRegistry.invoke(FunctionRegistry.java:1034)
    at org.apache.hadoop.hive.ql.udf.generic.GenericUDFBridge.evaluate(GenericUDFBridge.java:182)
    at org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator._evaluate(ExprNodeGenericFuncEvaluator.java:193)
    at org.apache.hadoop.hive.ql.exec.ExprNodeEvaluator.evaluate(ExprNodeEvaluator.java:77)
    at org.apache.hadoop.hive.ql.exec.ExprNodeEvaluator.evaluate(ExprNodeEvaluator.java:65)
    at org.apache.hadoop.hive.ql.exec.SelectOperator.process(SelectOperator.java:104)
    at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:838)
    at org.apache.hadoop.hive.ql.exec.GroupByOperator.forward(GroupByOperator.java:1019)
    at org.apache.hadoop.hive.ql.exec.GroupByOperator.processAggr(GroupByOperator.java:821)
    at org.apache.hadoop.hive.ql.exec.GroupByOperator.processKey(GroupByOperator.java:695)
    at org.apache.hadoop.hive.ql.exec.GroupByOperator.process(GroupByOperator.java:761)
    at org.apache.hadoop.hive.ql.exec.mr.ExecReducer.reduce(ExecReducer.java:235)
    ... 7 more
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.hive.ql.exec.FunctionRegistry.invoke(FunctionRegistry.java:1010)
    ... 18 more
Caused by: java.lang.ArrayIndexOutOfBoundsException: 0
    at org.apache.hadoop.hive.ql.udf.UDFConv.evaluate(UDFConv.java:160)
    ... 23 more
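For context on what the log is pointing at: the innermost cause, `ArrayIndexOutOfBoundsException: 0` inside `UDFConv.evaluate`, is raised when Hive's built-in `conv()` is handed an empty string, and the argument dump `{:Text, 16:IntWritable, 10:IntWritable}` shows exactly that: a `conv(<empty>, 16, 10)` call (the row dump also shows several empty `_col` values). If the full job calls `conv()` somewhere on a column that can be empty, a guard like the following sketch may sidestep the crash. This is only an assumption about the failing expression; `hex_col` and `some_table` are placeholder names, not identifiers from the original query:

```sql
-- Hypothetical guard: only call conv() when the input is a non-empty string.
-- conv('', 16, 10) triggers the ArrayIndexOutOfBoundsException seen in the log.
SELECT CASE
         WHEN hex_col IS NOT NULL AND hex_col != '' THEN conv(hex_col, 16, 10)
         ELSE NULL
       END AS hex_as_decimal
FROM some_table;
```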

No answers yet.

