"Exited with exitCode: -1000 because: Could not find any valid local directory for nmPrivate…"

0vvn1miw · posted 2021-06-03 in Hadoop

I am trying to run a MapReduce job with Hadoop, YARN, and Accumulo.
I get the output below and cannot track down the problem. It looks like a YARN issue, but I don't know what it is looking for. I have an nmPrivate folder at $HADOOP_PREFIX/grid/hadoop/hdfs/yarn/logs. Is that the folder it says it cannot find?

14/03/31 08:48:46 INFO mapreduce.Job: Job job_1395942264921_0023 failed with state FAILED due to: Application application_1395942264921_0023 failed 2 times due to AM Container for appattempt_1395942264921_0023_000002 exited with exitCode: -1000 due to: Could not find any valid local directory for nmPrivate/container_1395942264921_0023_02_000001.tokens. Failing this attempt.. Failing the application.
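The NodeManager creates the nmPrivate directory itself, under each directory listed in yarn.nodemanager.local-dirs, so this error usually means those local directories are missing, full, or not writable by the YARN user, not that you need to create nmPrivate by hand. A minimal sketch of the relevant yarn-site.xml entry (the path below is a placeholder, not taken from the question):

```xml
<!-- yarn-site.xml: directories where the NodeManager keeps container
     working files, including the nmPrivate subdirectory.
     /data/yarn/local is a placeholder path; it must exist and be
     writable by the YARN user on every node. -->
<property>
  <name>yarn.nodemanager.local-dirs</name>
  <value>/data/yarn/local</value>
</property>
```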

Answer by pprl5pva:

Testing Spark on YARN in cluster mode:

spark-submit --master yarn --deploy-mode cluster --class org.apache.spark.examples.SparkPi /usr/local/install/spark-2.2.0-bin-hadoop2.7/examples/jars/spark-examples_2.11-2.2.0.jar 100

I hit the same kind of error:

Application application_1532249549503_0007 failed 2 times due to AM Container for appattempt_1532249549503_0007_000002 exited with exitCode: -1000 Failing this attempt.Diagnostics: java.io.IOException: Resource file:/usr/local/install/spark-2.2.0-bin-hadoop2.7/examples/jars/spark-examples_2.11-2.2.0.jar changed on src filesystem (expected 1531576498000, was 1531576511000)

There is a suggestion that this kind of error can be fixed by changing core-site.xml or other Hadoop configuration files.
In the end, I fixed the error by setting the fs.defaultFS property in $HADOOP_HOME/etc/hadoop/core-site.xml.
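For reference, that property looks roughly like this in core-site.xml (the host and port below are placeholders for the cluster's actual NameNode address, which the answer does not give):

```xml
<!-- core-site.xml: default filesystem URI for Hadoop clients.
     namenode-host:9000 is a placeholder; substitute the real
     NameNode hostname and RPC port for your cluster. -->
<property>
  <name>fs.defaultFS</name>
  <value>hdfs://namenode-host:9000</value>
</property>
```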
