Spark on EMR and jobs (jar):

uqzxnwby · posted 2021-06-02 in Hadoop

So, I'm running (or trying to run) a compiled (fat jar) Spark/Scala program from the master node of an EMR cluster on AWS. I compiled the jar in a dev environment that has the same dependencies as prod. I'm submitting it with this spark-submit script:

SPARK_JAR=./spark/lib/spark-assembly-1.2.1-hadoop2.4.0.jar \
./spark-submit \
--deploy-mode cluster \
--verbose \
--master yarn-cluster \
--class sparkSQLProcessor \
--driver-memory 1g \
--executor-memory 1g \
--executor-cores 1 \
--num-executors 1 \
/home/hadoop/Spark-SQL-Job.jar args1 args2

The problem I'm hitting is what I assume is a configuration issue:

Exception in thread "main" java.io.FileNotFoundException: File file:/home/hadoop/.versions/spark-1.2.1.a/bin/spark/lib/spark-assembly-1.2.1-hadoop2.4.0.jar does not exist
    at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:516)
    at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:729)
    at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:506)
    at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:407)
    at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:337)
    at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:289)
    at org.apache.spark.deploy.yarn.ClientBase$class.copyFileToRemote(ClientBase.scala:102)
    at org.apache.spark.deploy.yarn.Client.copyFileToRemote(Client.scala:35)
    at org.apache.spark.deploy.yarn.ClientBase$$anonfun$prepareLocalResources$3.apply(ClientBase.scala:182)
    at org.apache.spark.deploy.yarn.ClientBase$$anonfun$prepareLocalResources$3.apply(ClientBase.scala:176)
    at scala.collection.immutable.List.foreach(List.scala:318)
    at org.apache.spark.deploy.yarn.ClientBase$class.prepareLocalResources(ClientBase.scala:176)
    at org.apache.spark.deploy.yarn.Client.prepareLocalResources(Client.scala:35)
    at org.apache.spark.deploy.yarn.ClientBase$class.createContainerLaunchContext(ClientBase.scala:308)
    at org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:35)
    at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:80)
    at org.apache.spark.deploy.yarn.ClientBase$class.run(ClientBase.scala:501)
    at org.apache.spark.deploy.yarn.Client.run(Client.scala:35)
    at org.apache.spark.deploy.yarn.Client$.main(Client.scala:139)
    at org.apache.spark.deploy.yarn.Client.main(Client.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

ma8fv8wu1#

I've been running Spark jobs on EMR and have never hit this error. Did you install Spark with an EMR bootstrap action, or are you using the newer EMR 4.0 release?
Either way, you should try submitting without setting the `SPARK_JAR` environment variable.
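A minimal sketch of that suggestion, reusing the jar path and class name from the question: drop `SPARK_JAR` entirely and let spark-submit locate the Spark assembly from its own installation. Note that the relative path `./spark/lib/...` in the original script resolves against the current working directory, which is consistent with the `FileNotFoundException` path shown above.

```shell
# Same submission as in the question, but without the SPARK_JAR variable;
# spark-submit will find the assembly jar in its own install directory.
./spark-submit \
  --deploy-mode cluster \
  --master yarn-cluster \
  --verbose \
  --class sparkSQLProcessor \
  --driver-memory 1g \
  --executor-memory 1g \
  --executor-cores 1 \
  --num-executors 1 \
  /home/hadoop/Spark-SQL-Job.jar args1 args2
```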
