The following error occurs when executing a Hive query with Spark as the execution engine.
Error:
Failed to execute spark task, with exception 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create spark client.)'
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.spark.SparkTask
Hive Console:
hive> set hive.execution.engine=spark;
hive> set spark.master=spark://INBBRDSSVM15.example.com:7077;
hive> set spark.executor.memory=2g;
Hadoop - 2.7.0
Hive - 1.2.1
Spark - 1.6.1
1 Answer
The YARN container memory was smaller than what the Spark executor requires. I set the container memory and its maximum to be larger than the Spark executor memory plus overhead. Check `yarn.scheduler.maximum-allocation-mb` and/or `yarn.nodemanager.resource.memory-mb`.
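For reference, in Spark 1.6 the default executor memory overhead on YARN is `max(384 MB, 10% of spark.executor.memory)`, so with `spark.executor.memory=2g` the container needs roughly 2048 + 384 ≈ 2432 MB. A minimal `yarn-site.xml` sketch (the 3072 MB values are illustrative assumptions, not values from the original post; pick sizes that fit your nodes):

```xml
<configuration>
  <!-- Largest container YARN will grant to a single request;
       must exceed executor memory + overhead (~2432 MB here). -->
  <property>
    <name>yarn.scheduler.maximum-allocation-mb</name>
    <value>3072</value>
  </property>
  <!-- Total memory on each NodeManager available to containers;
       must also be at least as large as one executor container. -->
  <property>
    <name>yarn.nodemanager.resource.memory-mb</name>
    <value>3072</value>
  </property>
</configuration>
```

After editing, restart the ResourceManager and NodeManagers so the new limits take effect.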
See the source here.