Running h2o, rsparkling, and sparklyr

Asked by uurv41yg on 2021-05-29 in Hadoop

I have been trying to run Spark 2.2 with master=yarn and H2O (rsparkling), but when I run h2o_context(sc) I get an exception:

Error: java.lang.NoSuchMethodError: org.apache.spark.util.Utils$.getUserJars(Lorg/apache/spark/SparkConf;Z)Lscala/collection/Seq;
    at org.apache.spark.repl.h2o.H2OInterpreter.createSettings(H2OInterpreter.scala:66)
    at org.apache.spark.repl.h2o.BaseH2OInterpreter.initializeInterpreter(BaseH2OInterpreter.scala:101)
    at org.apache.spark.repl.h2o.BaseH2OInterpreter.<init>(BaseH2OInterpreter.scala:291)
    at org.apache.spark.repl.h2o.H2OInterpreter.<init>(H2OInterpreter.scala:42)
    at water.api.scalaInt.ScalaCodeHandler.createInterpreterInPool(ScalaCodeHandler.scala:100)
    at water.api.scalaInt.ScalaCodeHandler$$anonfun$initializeInterpreterPool$1.apply(ScalaCodeHandler.scala:94)
    at water.api.scalaInt.ScalaCodeHandler$$anonfun$initializeInterpreterPool$1.apply(ScalaCodeHandler.scala:93)
    at scala.collection.immutable.Range.foreach(Range.scala:160)
    at water.api.scalaInt.ScalaCodeHandler.initializeInterpreterPool(ScalaCodeHandler.scala:93)
    at water.api.scalaInt.ScalaCodeHandler.<init>(ScalaCodeHandler.scala:37)
    at water.api.scalaInt.ScalaCodeHandler$.registerEndpoints(ScalaCodeHandler.scala:132)
    at water.api.CoreRestAPI$.registerEndpoints(CoreRestAPI.scala:32)
    at water.api.RestAPIManager.register(RestAPIManager.scala:39)
    at water.api.RestAPIManager.registerAll(RestAPIManager.scala:31)
    at org.apache.spark.h2o.backends.internal.InternalH2OBackend.init(InternalH2OBackend.scala:117)
    at org.apache.spark.h2o.H2OContext.init(H2OContext.scala:121)
    at org.apache.spark.h2o.H2OContext$.getOrCreate(H2OContext.scala:352)
    at org.apache.spark.h2o.H2OContext$.getOrCreate(H2OContext.scala:387)
    at org.apache.spark.h2o.H2OContext.getOrCreate(H2OContext.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sparklyr.Invoke$.invoke(invoke.scala:102)
    at sparklyr.StreamHandler$.handleMethodCall(stream.scala:97)
    at sparklyr.StreamHandler$.read(stream.scala:62)
    at sparklyr.BackendHandler.channelRead0(handler.scala:52)
    at sparklyr.BackendHandler.channelRead0(handler.scala:14)
    at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
    at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
    at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:293)
    at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:267)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1294)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
    at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:911)
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:643)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:566)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:480)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:442)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
    at java.lang.Thread.run(Thread.java:745)

I have also tried Spark 2.0.0 (installed via the sparklyr function spark_install) together with the corresponding Sparkling Water. That setup works when I start with master="local", but fails in the same way when I set master="yarn". Similarly, I tried Spark 1.6 and it works fine (also with master=yarn).
Any ideas?
Here is my code:

library(sparklyr)
library(rsparkling)
library(h2o)

# Point sparklyr at the HDP Spark 2 installation
Sys.setenv(SPARK_HOME = '/usr/hdp/2.6.3.0-235/spark2')

# Connect to the cluster and start an H2O context on it
sc <- spark_connect(master = "yarn")
h2o_context(sc)

I have tried installing various versions of h2o using (something like) install.packages("h2o", type = "source", repos = "http://h2o-release.s3.amazonaws.com/h2o/rel-tverberg/2/R"), but the error never changes.

Answer by zpqajqem:

Thanks for the bug report! It should already be fixed in the master branch of Sparkling Water, and the fix will be part of the next release (expected next week).
Thanks, Navdeep
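
Until that release lands, one workaround worth trying is to pin rsparkling to a Sparkling Water build that matches the installed Spark version, using the rsparkling.sparklingwater.version option, which must be set before rsparkling is loaded. This is only a sketch: the version string "2.2.0" below is illustrative, and you should substitute the Sparkling Water release that corresponds to your Spark version (see the H2O release pages for the mapping).

# Must be set BEFORE library(rsparkling) so the matching
# Sparkling Water artifact is fetched; "2.2.0" is illustrative.
options(rsparkling.sparklingwater.version = "2.2.0")

library(sparklyr)
library(rsparkling)
library(h2o)

Sys.setenv(SPARK_HOME = '/usr/hdp/2.6.3.0-235/spark2')

sc <- spark_connect(master = "yarn")
h2o_context(sc)

NoSuchMethodError exceptions like the one above typically indicate that the Sparkling Water jar on the classpath was built against a different Spark version than the one running, so matching the two versions is the first thing to check.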
