Cannot load HiveConf in an Oozie Sqoop workflow

5sxhfpxr · published 2021-06-02 in Hadoop

I created a Sqoop workflow in Oozie that imports data from MySQL into a Hive table.
If I run the Sqoop job from the terminal it works fine and the data is inserted into the Hive table, but when I put the job into Oozie and run it, it gives me this error:

Could not load org.apache.hadoop.hive.conf.HiveConf. Make sure HIVE_CONF_DIR is set correctly.

Encountered IOException running import job: java.io.IOException: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf

Can anyone help me?
I am using HDP 2.5.
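Before changing anything, it may help to confirm whether the Hive jars are present in the Oozie sharelib that the Sqoop action will use. A minimal check, assuming the Oozie server runs on master.nodes at the default port 11000 (adjust the URL for your cluster):

```shell
# List the jars Oozie will place on the classpath for each action type.
# If no hive-exec / hive-common jar shows up under "sqoop", the
# ClassNotFoundException for HiveConf is expected.
oozie admin -oozie http://master.nodes:11000/oozie -shareliblist sqoop
oozie admin -oozie http://master.nodes:11000/oozie -shareliblist hive
```

These commands only read server state, so they are safe to run on a live cluster.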
Here is my Oozie properties file:

nameNode=hdfs://master.nodes:8020
jobTracker=master.nodes:8050
queueName=default
examplesRoot=jas-oozie

oozie.use.system.libpath=true
oozie.libpath=${nameNode}/user/oozie/share/lib
oozie.action.sharelib.for.pig=hive,pig,hcatalog
oozie.action.sharelib.for.hive=pig,hcatalog,atlas,hive

oozie.wf.application.path=${nameNode}/user/${user.name}/${examplesRoot}/jas-oozie-workflow.xml
outputDir=jas
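One detail that stands out in the properties above: sharelib overrides are defined for the pig and hive actions, but not for the sqoop action, which is the action this workflow actually runs. A hedged guess at a fix is to add a sqoop entry so the Hive jars land on the Sqoop launcher's classpath (the exact library list is an assumption; adjust to what your sharelib contains):

```
# Assumption: pulling the hive/hcatalog sharelibs onto the Sqoop
# action's classpath so org.apache.hadoop.hive.conf.HiveConf resolves.
oozie.action.sharelib.for.sqoop=sqoop,hive,hcatalog
```

This is a config fragment for the same properties file shown above, not a confirmed fix.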

Here is the Oozie workflow XML:

<?xml version="1.0" encoding="UTF-8"?>

<workflow-app xmlns="uri:oozie:workflow:0.2" name="jas-import-wf">
    <start to="sqoop-import-air-quality-node"/>

    <action name="sqoop-import-air-quality-node">
        <sqoop xmlns="uri:oozie:sqoop-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <prepare>
                <delete path="${nameNode}/user/${wf:user()}/${examplesRoot}/output-data"/>
                <!-- <mkdir path="${nameNode}/user/${wf:user()}/${examplesRoot}/output-data"/> -->
            </prepare>
            <configuration>
                <property>
                    <name>mapred.job.queue.name</name>
                    <value>${queueName}</value>
                </property>
            </configuration>
            <arg>import</arg>
            <arg>--connect</arg>
            <arg>jdbc:mysql://xx.xx.xx.xx:3306/xxxx?dontTrackOpenResources=true&amp;defaultFetchSize=1000&amp;useCursorFetch=true&amp;zeroDateTimeBehavior=convertToNull</arg>
            <arg>--driver</arg>
            <arg>com.mysql.jdbc.Driver</arg>
            <arg>--username</arg>
            <arg>xx</arg>
            <arg>--password</arg>
            <arg>xx</arg>
            <arg>--query</arg>
            <arg>
                select &lt;fields&gt; from &lt;table&gt; where $CONDITIONS
            </arg>
            <arg>--hive-import</arg>
            <arg>--hive-drop-import-delims</arg>
            <arg>--hive-overwrite</arg>
            <arg>--hive-table</arg>
            <arg>table</arg>
            <arg>--target-dir</arg>
            <arg>/user/${wf:user()}/${examplesRoot}/output-data/sqoop-import</arg>
            <arg>-m</arg>
            <arg>1</arg>
        </sqoop>
        <ok to="end"/>
        <error to="import-air-quality-fail"/>
    </action>

    <kill name="import-air-quality-fail">
        <message>Sqoop from ICP failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>

    <end name="end"/>
</workflow-app>

Thank you.

i1icjdpr

1) Create a lib directory under the path ${nameNode}/user/${user.name}/${examplesRoot}/
2) Add the hive-exec jar to the ${nameNode}/user/${user.name}/${examplesRoot}/lib/ path, then retry.
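The two steps above can be sketched with HDFS shell commands. Oozie automatically adds any jars found in the workflow application's lib/ directory to the action classpath, which is why this works. The jar location under /usr/hdp is an assumption based on a typical HDP 2.5 client install; adjust it to wherever hive-exec lives on your edge node.

```shell
# 1) Create a lib directory next to the workflow definition
#    (examplesRoot is "jas-oozie" in the properties file above).
hdfs dfs -mkdir -p /user/$(whoami)/jas-oozie/lib

# 2) Copy the hive-exec jar into it. The source path is an assumed
#    HDP 2.5 layout -- the version glob avoids hard-coding a release.
hdfs dfs -put /usr/hdp/current/hive-client/lib/hive-exec-*.jar \
    /user/$(whoami)/jas-oozie/lib/
```

After the copy, rerun the coordinator/workflow; the Sqoop launcher should now find HiveConf on its classpath.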
