I'm doing some financial analysis with SparkR. When I try to start the SparkR session, I get the error below and I don't know how to fix it. Thank you.
sparkR.session()
17/11/24 19:23:29 ERROR RBackendHandler: getOrCreateSparkSession on org.apache.spark.sql.api.r.SQLUtils failed
java.lang.reflect.InvocationTargetException
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.base/java.lang.reflect.Method.invoke(Method.java:564)
    at org.apache.spark.api.r.RBackendHandler.handleMethodCall(RBackendHandler.scala:167)
    at org.apache.spark.api.r.RBackendHandler.channelRead0(RBackendHandler.scala:108)
    at org.apache.spark.api.r.RBackendHandler.channelRead0(RBackendHandler.scala:40)
    at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
    at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:287)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
    at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
    at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:293)
    at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:267)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1294)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
    at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:911)
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:643)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:566)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:480)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:442)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
    at java.base/java.lang.Thread.run(Thread.java:844)
Caused by: java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionStateBuilder':
    at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$instantiateSessionState(SparkSession.scala:1053)
    at org.apache.spark.sql.SparkSession$$anonfun$sessionState$2.apply(SparkSession.scala:130)
    at org.apache.spark.sql.SparkSession$$anonfun$sessionState$2.apply(SparkSession.scala:130)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:129)
    at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:126)
    at org.apache.spark.sql.api.r.SQLUtils$$anonfun$setSparkContextSessionConf$2.apply(SQLUtils.scala:71)
    at org.apache.spark.sql.api.r.SQLUtils$$anonfun$setSparkContextSessionConf$2.apply(SQLUtils.scala:70)
    at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)
    at scala.collection.Iterator$class.foreach(Iterator.scala:893)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
    at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
    at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
    at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732)
    at org.apache.spark.sql.api.r.SQLUtils$.setSparkContextSessionConf(SQLUtils.scala:70)
    at org.apache.spark.sql.api.r.SQLUtils$.getOrCreateSparkSession(SQLUtils.scala:63)
    at org.apache.spark.sql.api.r.SQLUtils.getOrCreateSparkSession(SQLUtils.scala)
    ... 36 more
Caused by: java.lang.IllegalArgumentException: Unable to locate hive jars to connect to metastore. Please set spark.sql.hive.metastore.jars.
    at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:302)
    at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:266)
    at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
    at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
    at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:194)
    at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:194)
    at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:194)
    at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
    at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:193)
    at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:105)
    at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:93)
    at org.apache.spark.sql.hive.HiveSessionStateBuilder.externalCatalog(HiveSessionStateBuilder.scala:39)
    at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog$lzycompute(HiveSessionStateBuilder.scala:54)
    at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:52)
    at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:35)
    at org.apache.spark.sql.internal.BaseSessionStateBuilder.build(BaseSessionStateBuilder.scala:289)
    at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$instantiateSessionState(SparkSession.scala:1050)
    ... 52 more
Error in handleErrors(returnStatus, conn) :
  java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionStateBuilder':
    at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$instantiateSessionState(SparkSession.scala:1053)
    at org.apache.spark.sql.SparkSession$$anonfun$sessionState$2.apply(SparkSession.scala:130)
    at org.apache.spark.sql.SparkSession$$anonfun$sessionState$2.apply(SparkSession.scala:130)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:129)
    at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:126)
    at org.apache.spark.sql.api.r.SQLUtils$$anonfun$setSparkContextSessionConf$2.apply(SQLUtils.scala:71)
    at org.apache.spark.sql.api.r.SQLUtils$$anonfun$setSparkContextSessionConf$2.apply(SQLUtils.scala:70)
    at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)
    at scala.collection.Iterator$class.foreach(Iterator.sca
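
For reference, the actionable part of the trace is the innermost Caused by: Spark's Hive support cannot locate the Hive client jars, so HiveSessionStateBuilder cannot be instantiated and the session never starts. Below is a minimal sketch of two common ways around that, assuming a stock Spark 2.2.x distribution under SPARK_HOME with its bundled SparkR package; the JAVA_HOME path is hypothetical. Note also that the java.base/ frames show the driver is running on Java 9+, which Spark 2.2 does not support, so moving to a Java 8 JDK may be the real fix rather than any Hive setting.

    # Sketch only, under the assumptions above -- not a confirmed fix.
    # If the driver is on Java 9+ (see the java.base/ frames), point R at
    # a Java 8 JDK first; this path is hypothetical.
    Sys.setenv(JAVA_HOME = "/usr/lib/jvm/java-8-openjdk")

    library(SparkR, lib.loc = file.path(Sys.getenv("SPARK_HOME"), "R", "lib"))

    # Workaround 1: start the session without Hive support. Spark then
    # builds a plain in-memory catalog instead of HiveSessionStateBuilder,
    # so the Hive metastore jars are never looked up.
    sparkR.session(master = "local[*]", enableHiveSupport = FALSE)

    # Workaround 2: keep Hive support but tell Spark where the Hive client
    # jars are, as the error message asks. "maven" makes Spark download
    # them at startup; the classpath of an existing Hive install also works.
    # sparkR.session(
    #   enableHiveSupport = TRUE,
    #   sparkConfig = list(spark.sql.hive.metastore.jars = "maven")
    # )

Whether workaround 1 is acceptable depends on the analysis: reading files into SparkDataFrames works fine without Hive support, but persistent Hive tables need the metastore and therefore workaround 2 or a proper Hive installation.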