Vertex failure when executing a PySpark job

8yoxcaq7 posted on 2021-05-27 in Spark

I am executing a PySpark job via the spark-submit command. It was working fine previously; I have run the same job more than ten times. It is just a simple data load from a CSV file into a Hive table, containing only 500 records. Now, when I execute the same command, it fails with a vertex failure.

I am using the spark-submit command below.

spark-submit --num-executors 3 --executor-cores 3 --executor-memory 20g \
    --jars /usr/hdp/3.1.0.0-78/hive_warehouse_connector/hive-warehouse-connector-assembly-1.0.0.3.1.0.0-78.jar \
    --py-files /usr/hdp/current/hive_warehouse_connector/pyspark_hwc-1.0.0.3.1.0.0-78.zip \
    main.py /user/hive/source

Here /user/hive/source is the source file location passed to main.py.
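For reference, the load itself is trivial. Below is a minimal sketch of what a main.py of this kind typically looks like, assuming the HDP 3.1 Hive Warehouse Connector; the app name, the CSV options, and the target table db.target_table are illustrative placeholders, not taken from the original post.

    import sys

    from pyspark.sql import SparkSession

    # Build the Spark session; the HWC jar and its Python bindings are
    # supplied on the spark-submit command line shown above.
    spark = SparkSession.builder.appName("csv-to-hive-load").getOrCreate()

    # The first CLI argument is the source directory, e.g. /user/hive/source.
    source_path = sys.argv[1]

    # Read the CSV data; the header option is an assumption.
    df = spark.read.option("header", "true").csv(source_path)

    # Write into a managed Hive table through the Hive Warehouse Connector
    # data source shipped with HDP 3.1. "db.target_table" is a placeholder.
    (df.write
        .format("com.hortonworks.spark.sql.hive.llap.HiveWarehouseConnector")
        .option("table", "db.target_table")
        .mode("append")
        .save())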

I am getting the error message below.

Error while processing statement: FAILED: Execution Error,
    return code 2 from org.apache.hadoop.hive.ql.exec.tez.TezTask. Vertex failed, vertexName=Map 1,
    vertexId=vertex_1599711935259_0207_17_00, diagnostics=[Task failed, taskId=task_1599711935259_0207_17_00_000000,
    diagnostics=[TaskAttempt 0 failed, info=[Error: Error while running task ( failure ) :
    Vertex did not succeed due to OWN_TASK_FAILURE, failedTasks:1 killedTasks:0,
    Vertex vertex_1500011935259_0207_17_00 [Map 1] killed/failed due to:OWN_TASK_FAILURE]Vertex killed,
    vertexName=Reducer 2, vertexId=vertex_1500011935259_0207_17_00,
    diagnostics=[Vertex received Kill while in RUNNING state.,
    Vertex did not succeed due to OTHER_VERTEX_FAILURE, failedTasks:0 killedTasks:1,
    Vertex vertex_1500011935259_0207_17_00 [Reducer 2] killed/failed due to:OTHER_VERTEX_FAILURE]
    DAG did not succeed due to VERTEX_FAILURE. failedVertices:1 killedVertices:1

Can anyone suggest how I can resolve this error?


2nbm6dog1#

The issue has been resolved. I found it was caused by a mismatch in how the following parameters were calculated relative to each other.

    hive.tez.container.size
    tez.runtime.io.sort.mb = 0.4 * hive.tez.container.size
    tez.runtime.unordered.output.buffer.size-mb = 0.1 * hive.tez.container.size
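
As a rough illustration of that rule (the container size below is an example value, not one from the post), a consistent set of values would look like:

    hive.tez.container.size=4096                           (MB, example value)
    tez.runtime.io.sort.mb=1638                            (≈ 0.4 * 4096)
    tez.runtime.unordered.output.buffer.size-mb=409        (≈ 0.1 * 4096)

If the sort buffer or the unordered output buffer is sized larger than these fractions of the Tez container, map tasks can run out of memory and fail, which surfaces as the OWN_TASK_FAILURE vertex error shown above.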
