Unable to run a shell script with Oozie

dsf9zpds · published 2021-06-03 in Hadoop

Hi, I am trying to run a shell script through Oozie, but it fails with:

Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.ShellMain], exit code [1]

My job.properties file:

nameNode=hdfs://ip-172-31-41-199.us-west-2.compute.internal:8020
jobTracker=ip-172-31-41-199.us-west-2.compute.internal:8032
queueName=default
oozie.libpath=${nameNode}/user/oozie/share/lib/
oozie.use.system.libpath=true
oozie.wf.rerun.failnodes=true
oozieProjectRoot=shell_example
oozie.wf.application.path=${nameNode}/user/karun/${oozieProjectRoot}/apps/shell

My workflow.xml:

<workflow-app xmlns="uri:oozie:workflow:0.1" name="pi.R example">
  <start to="shell-node"/>
  <action name="shell-node">
    <shell xmlns="uri:oozie:shell-action:0.1">
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <configuration>
        <property>
          <name>mapred.job.queue.name</name>
          <value>${queueName}</value>
        </property>
      </configuration>
      <exec>script.sh</exec>
      <file>/user/karun/oozie-oozi/script.sh#script.sh</file>
      <capture-output/>
    </shell>
    <ok to="end"/>
    <error to="fail"/>
  </action>
  <kill name="fail">
    <message>Incorrect output</message>
  </kill>
  <end name="end"/>
</workflow-app>

My shell script, script.sh:

export SPARK_HOME=/opt/cloudera/parcels/CDH-5.4.2-1.cdh5.4.2.p0.2/lib/spark
export YARN_CONF_DIR=/etc/hadoop/conf
export JAVA_HOME=/usr/java/jdk1.7.0_67-cloudera
export HADOOP_CMD=/usr/bin/hadoop
/SparkR-pkg/lib/SparkR/sparkR-submit --master yarn-client examples/pi.R yarn-client 4

Error log file:

WEBHCAT_DEFAULT_XML=/opt/cloudera/parcels/CDH-5.4.2-1.cdh5.4.2.p0.2/etc/hive-webhcat/conf.dist/webhcat-default.xml:
CDH_KMS_HOME=/opt/cloudera/parcels/CDH-5.4.2-1.cdh5.4.2.p0.2/lib/hadoop-kms:
LANG=en_US.UTF-8:
HADOOP_MAPRED_HOME=/opt/cloudera/parcels/CDH-5.4.2-1.cdh5.4.2.p0.2/lib/hadoop-mapreduce:

=================================================================
>>> Invoking Shell command line now >>

Stdoutput Running /opt/cloudera/parcels/CDH-5.4.2-1.cdh5.4.2.p0.2/lib/spark/bin/spark-submit --class edu.berkeley.cs.amplab.sparkr.SparkRRunner --files hdfs://ip-172-31-41-199.us-west-2.compute.internal:8020/user/karun/examples/pi.R --master yarn-client /SparkR-pkg/lib/SparkR/sparkr-assembly-0.1.jar hdfs://ip-172-31-41-199.us-west-2.compute.internal:8020/user/karun/examples/pi.R yarn-client 4
Stdoutput Fatal error: cannot open file 'pi.R': No such file or directory
Exit code of the Shell command 2
<<< Invocation of Shell command completed <<<
<<< Invocation of Main class completed <<<
 Failing Oozie Launcher, Main class  [org.apache.oozie.action.hadoop.ShellMain], exit code [1]

 Oozie Launcher failed, finishing Hadoop job gracefully

 Oozie Launcher, uploading action data to HDFS sequence file: hdfs://ip-172-31-41-199.us-west-2.compute.internal:8020/user/karun/oozie-oozi/0000035-150722003725443-oozie-oozi-W/shell-node--shell/action-data.seq

 Oozie Launcher ends

I don't know how to resolve this issue. Any help would be much appreciated.


bqjvbblv1#

sparkR-submit  ...  examples/pi.R  ...

Fatal error: cannot open file 'pi.R': No such file or directory
The message is quite explicit: the shell tries to read the R script from the local filesystem. But local to what, exactly???
Oozie runs your shell through YARN, so YARN allocates a container on some arbitrary machine. This is something you have to drill into your head until it becomes a reflex: every resource an Oozie action needs (scripts, libraries, config files, etc.) must be
- available in HDFS beforehand
- downloaded at execution time, thanks to the <file> instructions in your Oozie script
- accessed as local files in the current working directory
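The second and third points can be illustrated locally. Oozie downloads each `<file>` entry into the container's working directory and links it under the name after the `#` fragment (or the file's basename when no fragment is given). A small simulation of that naming behaviour, using made-up temp paths rather than a real container:

```shell
# Mimic what Oozie does for <file>/user/karun/oozie-oozi/script.sh#script.sh:
# fetch the file into the container's CWD and expose it under the '#' name.
workdir=$(mktemp -d)
src="$workdir/downloaded_script.sh"      # stands in for the HDFS download
printf '#!/bin/sh\necho hello\n' > "$src"
ln -s "$src" "$workdir/script.sh"        # link name taken from the '#' fragment
sh "$workdir/script.sh"                  # → prints "hello"
```

This is why `<exec>script.sh</exec>` works with a bare filename: the action only ever sees the local symlink in its working directory, never the HDFS path.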
In your case:

<exec>script.sh</exec>
<file>/user/karun/oozie-oozi/script.sh</file>
<file>/user/karun/some/place/pi.R</file>

then

sparkR-submit  ...  pi.R  ...
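Putting it together with the original workflow.xml, the shell action would look roughly like this (a sketch; `/user/karun/some/place/pi.R` is a placeholder HDFS path, and the rest of the workflow stays unchanged):

```xml
<shell xmlns="uri:oozie:shell-action:0.1">
  <job-tracker>${jobTracker}</job-tracker>
  <name-node>${nameNode}</name-node>
  <configuration>
    <property>
      <name>mapred.job.queue.name</name>
      <value>${queueName}</value>
    </property>
  </configuration>
  <exec>script.sh</exec>
  <!-- both files are pulled from HDFS into the container's working dir -->
  <file>/user/karun/oozie-oozi/script.sh#script.sh</file>
  <file>/user/karun/some/place/pi.R#pi.R</file>
  <capture-output/>
</shell>
```

With both files staged in HDFS and listed here, the script can reference `pi.R` as a plain local filename.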
