This is my first post! I have been looking for a solution to this for a few weeks now (on and off) with no luck...
I have a Java Hive UDF in which I want to run SQL against a Hive table and map the results in memory for later lookups. I have the correct connection information, but it cannot find the HiveDriver. I have tried adding the Hive JDBC driver jar inside Hive, and I have also tried exporting it into the runnable jar itself.
The error is below, followed by the relevant code.
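For context, the UDF class is laid out roughly like this (a trimmed-down sketch, not the full class: the package, class, and method names come from the stack trace below, but the exact signatures and the lazy-load check are simplified by me):

package xhUDFs;

import java.util.HashMap;
import java.util.Map;

import org.apache.hadoop.hive.ql.exec.UDF;

public class GetRefdata extends UDF {

    // zipcode5 -> cbsa_name, loaded once over JDBC (see the snippet further down)
    private static Map<String, String> censusMap = new HashMap<String, String>();

    public String evaluate(String refName, String zipcode) {
        if (censusMap.isEmpty()) {
            getRefmap(); // the JDBC load shown below -- this is where it fails
        }
        return censusMap.get(zipcode);
    }

    private void getRefmap() {
        // JDBC query against the reference table -- see the snippet below
    }
}

The idea is simply to fill censusMap the first time the function is evaluated and answer lookups from memory after that.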
hive> add jar /tmp/xhUDFs.jar;
Added [/tmp/xhUDFs.jar] to class path
Added resources: [/tmp/xhUDFs.jar]
hive> CREATE TEMPORARY FUNCTION getRefdata as 'xhUDFs.GetRefdata';
OK
Time taken: 0.042 seconds
hive> select '06001',getRefdata('CBSAname','06001');
java.lang.ClassNotFoundException: org.apache.hadoop.hive.jdbc.HiveDriver
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:264)
    at xhUDFs.GetRefdata.getRefmap(GetRefdata.java:53)
    at xhUDFs.GetRefdata.evaluate(GetRefdata.java:35)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.hadoop.hive.ql.exec.FunctionRegistry.invoke(FunctionRegistry.java:954)
    at org.apache.hadoop.hive.ql.udf.generic.GenericUDFBridge.evaluate(GenericUDFBridge.java:182)
    at org.apache.hadoop.hive.ql.udf.generic.GenericUDF.initializeAndFoldConstants(GenericUDF.java:168)
    at org.apache.hadoop.hive.ql.plan.ExprNodeGenericFuncDesc.newInstance(ExprNodeGenericFuncDesc.java:233)
    at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getXpathOrFuncExprNodeDesc(TypeCheckProcFactory.java:1093)
    at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.process(TypeCheckProcFactory.java:1312)
    at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:90)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatchAndReturn(DefaultGraphWalker.java:95)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:79)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:133)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:110)
    at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory.genExprNode(TypeCheckProcFactory.java:209)
    at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory.genExprNode(TypeCheckProcFactory.java:153)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genAllExprNodeDesc(SemanticAnalyzer.java:10499)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:10455)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:3822)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:3601)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPostGroupByBodyPlan(SemanticAnalyzer.java:8943)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:8898)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:9743)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:9636)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genOPTree(SemanticAnalyzer.java:10109)
    at org.apache.hadoop.hive.ql.parse.CalcitePlanner.genOPTree(CalcitePlanner.java:329)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:10120)
    at org.apache.hadoop.hive.ql.parse.CalcitePlanner.analyzeInternal(CalcitePlanner.java:211)
    at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:227)
    at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:454)
    at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:314)
    at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1164)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1212)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1101)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1091)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:216)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:168)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:379)
    at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:739)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:684)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:624)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
java.lang.ClassNotFoundException: org.apache.hadoop.hive.jdbc.HiveDriver
Now the code snippet:
try {
    // Load the reference data (zipcode5 -> cbsa_name) into the in-memory map
    Class.forName("org.apache.hadoop.hive.jdbc.HiveDriver");
    Connection wConn = DriverManager.getConnection("jdbc:hive2://172.20.2.40:10000/refdata", "", "");
    Statement sql = wConn.createStatement();
    ResultSet wResult = sql.executeQuery("select distinct zipcode5,cbsa_name,msa_name,pmsa_name,region,csaname,cbsa_div_name from refdata.zip_code_business_all_16_1_2016_orc where primaryrecord='P'");
    while (wResult.next()) {
        censusMap.put(wResult.getString("zipcode5"), wResult.getString("cbsa_name"));
    }
} catch (SQLException | ClassNotFoundException e) {
    // TODO Auto-generated catch block
    e.printStackTrace();
}
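In case it helps narrow things down, here is a bare-bones standalone check I can run outside of Hive with the driver jar on the classpath; it only reuses the same driver class string and JDBC URL from the snippet above, nothing else:

import java.sql.Connection;
import java.sql.DriverManager;

// Minimal standalone test: same driver class name and URL as in the UDF,
// just run outside of Hive so the classpath is fully under my control.
public class DriverCheck {
    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.hadoop.hive.jdbc.HiveDriver");
        Connection conn = DriverManager.getConnection(
                "jdbc:hive2://172.20.2.40:10000/refdata", "", "");
        System.out.println("connected: " + !conn.isClosed());
        conn.close();
    }
}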