Livy server NoSuchMethodError: HttpMessages$SessionInfo

Asked by g6ll5ycj, posted 2021-05-29 in Hadoop

The system has the following versions installed:
Spark 2.0.0 with Scala 2.10.6
Hadoop 2.7.3
Livy server 0.3.0 (Cloudera) with Scala 2.10.4
OpenJDK 1.7.0_95
Livy compiles correctly with the following Maven command:

mvn -Pspark-2.0 package -DskipTests

The build completes successfully, as shown below.

[INFO] Reactor Summary:
[INFO]
[INFO] livy-main .......................................... SUCCESS [  1.236 s]
[INFO] livy-api ........................................... SUCCESS [  1.017 s]
[INFO] livy-client-common ................................. SUCCESS [  0.115 s]
[INFO] livy-test-lib ...................................... SUCCESS [  0.192 s]
[INFO] livy-rsc ........................................... SUCCESS [  0.819 s]
[INFO] livy-core_2.10 ..................................... SUCCESS [  0.175 s]
[INFO] livy-repl_2.10 ..................................... SUCCESS [  1.013 s]
[INFO] livy-core_2.11 ..................................... SUCCESS [ 10.379 s]
[INFO] livy-repl_2.11 ..................................... SUCCESS [  7.051 s]
[INFO] livy-server ........................................ SUCCESS [  5.957 s]
[INFO] livy-assembly ...................................... SUCCESS [  3.448 s]
[INFO] livy-client-http ................................... SUCCESS [  2.346 s]
[INFO] livy-scala-api_2.10 ................................ SUCCESS [  1.308 s]
[INFO] livy-scala-api_2.11 ................................ SUCCESS [  1.987 s]
[INFO] minicluster-dependencies_2.10 ...................... SUCCESS [  0.082 s]
[INFO] minicluster-dependencies_2.11 ...................... SUCCESS [  0.068 s]
[INFO] livy-integration-test .............................. SUCCESS [  2.355 s]
[INFO] livy-coverage-report ............................... SUCCESS [  0.150 s]
[INFO] livy-examples ...................................... SUCCESS [  0.573 s]
[INFO] livy-python-api .................................... SUCCESS [  0.264 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 41.054 s
[INFO] Finished at: 2016-11-29T12:22:28+01:00
[INFO] Final Memory: 98M/1386M
[INFO] ------------------------------------------------------------------------

livy.conf is configured to run in YARN cluster mode. Only the following parameters were modified:

livy.server.host = master       # master is set in /etc/hosts to 192.168.0.32
livy.server.port = 8998
livy.spark.master = yarn
livy.spark.deployMode = cluster

When testing the server with the following request:

curl -X POST  --data '{"kind": "scala"}' -H "Content-Type: application/json" master:8998/sessions -d '{"code":"val a= 100;val b=a+1100;b"}'
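Note that the curl command above passes both `--data` and `-d`, which curl concatenates into a single body joined by `&`, so the server receives invalid JSON. The usual Livy REST flow (a sketch only, reusing the host, port, and payloads from the question; it assumes a running Livy server to actually execute the POSTs) splits this into two requests: first create a session, then submit the code as a statement once the session is idle.

```python
import json
from urllib import request

LIVY = "http://master:8998"  # assumption: same host/port as in livy.conf


def post(path, payload):
    """POST a JSON payload to the Livy REST API and return the decoded reply."""
    req = request.Request(
        LIVY + path,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)


# Step 1: create an interactive session (payload copied from the question).
session_payload = {"kind": "scala"}

# Step 2: once the session reports state "idle", submit the code as a
# statement to /sessions/<id>/statements -- a separate request, not a
# second -d flag on the session-creation call.
statement_payload = {"code": "val a= 100;val b=a+1100;b"}

# post("/sessions", session_payload)             # requires a live server
# post("/sessions/0/statements", statement_payload)
```

Whether `"scala"` is an accepted `kind` value depends on the Livy version; the payloads here are taken verbatim from the question, not from the Livy documentation.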

The command itself completes without any visible error, but afterwards the Livy server log shows the following:

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/livysrc/server/target/jars/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/livysrc/server/target/jars/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
16/11/29 13:34:47 INFO StateStore$: Using BlackholeStateStore for recovery.
16/11/29 13:34:47 INFO BatchSessionManager: Recovered 0 batch sessions. Next session id: 0
16/11/29 13:34:47 INFO InteractiveSessionManager: Recovered 0 interactive sessions. Next session id: 0
16/11/29 13:34:47 INFO RMProxy: Connecting to ResourceManager at /192.168.0.32:8032
16/11/29 13:34:47 WARN RequestLogHandler: !RequestLog
16/11/29 13:34:47 INFO WebServer: Starting server on http://master:8998
16/11/29 13:34:47 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/11/29 13:34:50 INFO InteractiveSession$: Creating LivyClient for sessionId: 0
16/11/29 13:34:50 WARN RSCConf: Your hostname, pcitbumint, resolves to a loopback address; using 192.168.199.1  instead (on interface vmnet8)
16/11/29 13:34:50 WARN RSCConf: Set 'livy.rsc.rpc.server.address' if you need to bind to another address.
16/11/29 13:34:50 INFO InteractiveSessionManager: Registering new session 0
16/11/29 13:34:51 INFO ContextLauncher: 16/11/29 13:34:51 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/11/29 13:34:51 INFO ContextLauncher: 16/11/29 13:34:51 INFO client.RMProxy: Connecting to ResourceManager at /192.168.0.32:8032
16/11/29 13:34:51 INFO ContextLauncher: 16/11/29 13:34:51 INFO yarn.Client: Requesting a new application from cluster with 1 NodeManagers
16/11/29 13:34:52 INFO ContextLauncher: 16/11/29 13:34:52 INFO yarn.Client: Verifying our application has not requested more than the maximum memory capability of the cluster (8192 MB per container)
16/11/29 13:34:52 INFO ContextLauncher: 16/11/29 13:34:52 INFO yarn.Client: Will allocate AM container, with 1408 MB memory including 384 MB overhead
16/11/29 13:34:52 INFO ContextLauncher: 16/11/29 13:34:52 INFO yarn.Client: Setting up container launch context for our AM
16/11/29 13:34:52 INFO ContextLauncher: 16/11/29 13:34:52 INFO yarn.Client: Setting up the launch environment for our AM container
16/11/29 13:34:52 INFO ContextLauncher: 16/11/29 13:34:52 INFO yarn.Client: Preparing resources for our AM container
16/11/29 13:34:52 INFO ContextLauncher: 16/11/29 13:34:52 WARN yarn.Client: Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME.
16/11/29 13:34:53 INFO ContextLauncher: 16/11/29 13:34:53 INFO yarn.Client: Uploading resource file:/tmp/spark-7e7077b1-0763-484a-a911-206de849a24c/__spark_libs__2481895686182315148.zip -> hdfs://192.168.0.32:9000/user/pcitbu/.sparkStaging/application_1480415886078_0020/__spark_libs__2481895686182315148.zip
16/11/29 13:34:54 INFO ContextLauncher: 16/11/29 13:34:54 INFO yarn.Client: Uploading resource file:/usr/livysrc/rsc/target/jars/scala-reflect-2.10.4.jar -> hdfs://192.168.0.32:9000/user/pcitbu/.sparkStaging/application_1480415886078_0020/scala-reflect-2.10.4.jar
16/11/29 13:34:54 INFO ContextLauncher: 16/11/29 13:34:54 INFO yarn.Client: Uploading resource file:/usr/livysrc/rsc/target/jars/spark-tags_2.11-2.0.1.jar -> hdfs://192.168.0.32:9000/user/pcitbu/.sparkStaging/application_1480415886078_0020/spark-tags_2.11-2.0.1.jar
16/11/29 13:34:54 INFO ContextLauncher: 16/11/29 13:34:54 INFO yarn.Client: Uploading resource file:/usr/livysrc/rsc/target/jars/livy-client-http-0.2.0.jar -> hdfs://192.168.0.32:9000/user/pcitbu/.sparkStaging/application_1480415886078_0020/livy-client-http-0.2.0.jar
16/11/29 13:34:54 INFO ContextLauncher: 16/11/29 13:34:54 INFO yarn.Client: Uploading resource file:/usr/livysrc/rsc/target/jars/netty-all-4.0.29.Final.jar -> hdfs://192.168.0.32:9000/user/pcitbu/.sparkStaging/application_1480415886078_0020/netty-all-4.0.29.Final.jar
16/11/29 13:34:54 INFO ContextLauncher: 16/11/29 13:34:54 INFO yarn.Client: Uploading resource file:/usr/livysrc/rsc/target/jars/scalatest_2.11-2.2.6.jar -> hdfs://192.168.0.32:9000/user/pcitbu/.sparkStaging/application_1480415886078_0020/scalatest_2.11-2.2.6.jar
16/11/29 13:34:54 INFO ContextLauncher: 16/11/29 13:34:54 INFO yarn.Client: Uploading resource file:/usr/livysrc/rsc/target/jars/scala-xml_2.11-1.0.2.jar -> hdfs://192.168.0.32:9000/user/pcitbu/.sparkStaging/application_1480415886078_0020/scala-xml_2.11-1.0.2.jar
16/11/29 13:34:54 INFO ContextLauncher: 16/11/29 13:34:54 INFO yarn.Client: Uploading resource file:/usr/livysrc/rsc/target/jars/scala-library-2.10.6.jar -> hdfs://192.168.0.32:9000/user/pcitbu/.sparkStaging/application_1480415886078_0020/scala-library-2.10.6.jar
16/11/29 13:34:54 INFO ContextLauncher: 16/11/29 13:34:54 INFO yarn.Client: Uploading resource file:/usr/livysrc/rsc/target/jars/scala-reflect-2.10.6.jar -> hdfs://192.168.0.32:9000/user/pcitbu/.sparkStaging/application_1480415886078_0020/scala-reflect-2.10.6.jar
16/11/29 13:34:54 INFO ContextLauncher: 16/11/29 13:34:54 INFO yarn.Client: Uploading resource file:/usr/livysrc/rsc/target/jars/scala-library-2.10.4.jar -> hdfs://192.168.0.32:9000/user/pcitbu/.sparkStaging/application_1480415886078_0020/scala-library-2.10.4.jar
16/11/29 13:34:54 INFO ContextLauncher: 16/11/29 13:34:54 INFO yarn.Client: Uploading resource file:/usr/livysrc/rsc/target/jars/livy-api-0.3.0-SNAPSHOT.jar -> hdfs://192.168.0.32:9000/user/pcitbu/.sparkStaging/application_1480415886078_0020/livy-api-0.3.0-SNAPSHOT.jar
16/11/29 13:34:54 INFO ContextLauncher: 16/11/29 13:34:54 INFO yarn.Client: Uploading resource file:/usr/livysrc/rsc/target/jars/livy-rsc-0.3.0-SNAPSHOT.jar -> hdfs://192.168.0.32:9000/user/pcitbu/.sparkStaging/application_1480415886078_0020/livy-rsc-0.3.0-SNAPSHOT.jar
16/11/29 13:34:54 INFO ContextLauncher: 16/11/29 13:34:54 INFO yarn.Client: Uploading resource file:/usr/livysrc/repl/scala-2.11/target/jars/livy-repl_2.11-0.3.0-SNAPSHOT.jar -> hdfs://192.168.0.32:9000/user/pcitbu/.sparkStaging/application_1480415886078_0020/livy-repl_2.11-0.3.0-SNAPSHOT.jar
16/11/29 13:34:54 INFO ContextLauncher: 16/11/29 13:34:54 WARN yarn.Client: Same name resource file:/usr/livysrc/repl/scala-2.11/target/jars/livy-client-http-0.2.0.jar added multiple times to distributed cache
16/11/29 13:34:54 INFO ContextLauncher: 16/11/29 13:34:54 INFO yarn.Client: Uploading resource file:/usr/livysrc/repl/scala-2.11/target/jars/commons-codec-1.9.jar -> hdfs://192.168.0.32:9000/user/pcitbu/.sparkStaging/application_1480415886078_0020/commons-codec-1.9.jar
16/11/29 13:34:54 INFO ContextLauncher: 16/11/29 13:34:54 INFO yarn.Client: Uploading resource file:/usr/livysrc/repl/scala-2.11/target/jars/livy-core_2.11-0.3.0-SNAPSHOT.jar -> hdfs://192.168.0.32:9000/user/pcitbu/.sparkStaging/application_1480415886078_0020/livy-core_2.11-0.3.0-SNAPSHOT.jar
16/11/29 13:34:55 INFO ContextLauncher: 16/11/29 13:34:55 WARN yarn.Client: Same name resource file:/usr/livysrc/repl/scala-2.11/target/jars/livy-api-0.3.0-SNAPSHOT.jar added multiple times to distributed cache
16/11/29 13:34:55 INFO ContextLauncher: 16/11/29 13:34:55 INFO yarn.Client: Uploading resource file:/tmp/spark-7e7077b1-0763-484a-a911-206de849a24c/__spark_conf__4729152892300856418.zip -> hdfs://192.168.0.32:9000/user/pcitbu/.sparkStaging/application_1480415886078_0020/__spark_conf__.zip
16/11/29 13:34:55 INFO ContextLauncher: 16/11/29 13:34:55 INFO spark.SecurityManager: Changing view acls to: pcitbu
16/11/29 13:34:55 INFO ContextLauncher: 16/11/29 13:34:55 INFO spark.SecurityManager: Changing modify acls to: pcitbu
16/11/29 13:34:55 INFO ContextLauncher: 16/11/29 13:34:55 INFO spark.SecurityManager: Changing view acls groups to: 
16/11/29 13:34:55 INFO ContextLauncher: 16/11/29 13:34:55 INFO spark.SecurityManager: Changing modify acls groups to: 
16/11/29 13:34:55 INFO ContextLauncher: 16/11/29 13:34:55 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(pcitbu); groups with view permissions: Set(); users  with modify permissions: Set(pcitbu); groups with modify permissions: Set()
16/11/29 13:34:55 INFO ContextLauncher: 16/11/29 13:34:55 INFO yarn.Client: Submitting application application_1480415886078_0020 to ResourceManager
16/11/29 13:34:55 INFO ContextLauncher: 16/11/29 13:34:55 INFO impl.YarnClientImpl: Submitted application application_1480415886078_0020
16/11/29 13:34:55 INFO ContextLauncher: 16/11/29 13:34:55 INFO yarn.Client: Application report for application_1480415886078_0020 (state: ACCEPTED)
16/11/29 13:34:55 INFO ContextLauncher: 16/11/29 13:34:55 INFO yarn.Client: 
16/11/29 13:34:55 INFO ContextLauncher:      client token: N/A
16/11/29 13:34:55 INFO ContextLauncher:      diagnostics: N/A
16/11/29 13:34:55 INFO ContextLauncher:      ApplicationMaster host: N/A
16/11/29 13:34:55 INFO ContextLauncher:      ApplicationMaster RPC port: -1
16/11/29 13:34:55 INFO ContextLauncher:      queue: root.pcitbu
16/11/29 13:34:55 INFO ContextLauncher:      start time: 1480422895192
16/11/29 13:34:55 INFO ContextLauncher:      final status: UNDEFINED
16/11/29 13:34:55 INFO ContextLauncher:      tracking URL: http://master:8088/proxy/application_1480415886078_0020/
16/11/29 13:34:55 INFO ContextLauncher:      user: pcitbu
16/11/29 13:34:55 INFO ContextLauncher: 16/11/29 13:34:55 INFO util.ShutdownHookManager: Shutdown hook called
16/11/29 13:34:55 INFO ContextLauncher: 16/11/29 13:34:55 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-7e7077b1-0763-484a-a911-206de849a24c
16/11/29 13:34:58 WARN RSCClient: Client RPC channel closed unexpectedly.
16/11/29 13:34:58 WARN RSCClient: Error stopping RPC.
io.netty.util.concurrent.BlockingOperationException: DefaultChannelPromise@519b5cd2(uncancellable)
    at io.netty.util.concurrent.DefaultPromise.checkDeadLock(DefaultPromise.java:390)
    at io.netty.channel.DefaultChannelPromise.checkDeadLock(DefaultChannelPromise.java:157)
    at io.netty.util.concurrent.DefaultPromise.await(DefaultPromise.java:251)
    at io.netty.channel.DefaultChannelPromise.await(DefaultChannelPromise.java:129)
    at io.netty.channel.DefaultChannelPromise.await(DefaultChannelPromise.java:28)
    at io.netty.util.concurrent.DefaultPromise.sync(DefaultPromise.java:218)
    at io.netty.channel.DefaultChannelPromise.sync(DefaultChannelPromise.java:117)
    at io.netty.channel.DefaultChannelPromise.sync(DefaultChannelPromise.java:28)
    at com.cloudera.livy.rsc.rpc.Rpc.close(Rpc.java:305)
    at com.cloudera.livy.rsc.RSCClient.stop(RSCClient.java:221)
    at com.cloudera.livy.rsc.RSCClient$2$1.onSuccess(RSCClient.java:118)
    at com.cloudera.livy.rsc.RSCClient$2$1.onSuccess(RSCClient.java:112)
    at com.cloudera.livy.rsc.Utils$2.operationComplete(Utils.java:108)
    at io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:680)
    at io.netty.util.concurrent.DefaultPromise.notifyListeners(DefaultPromise.java:567)
    at io.netty.util.concurrent.DefaultPromise.trySuccess(DefaultPromise.java:406)
    at io.netty.channel.DefaultChannelPromise.trySuccess(DefaultChannelPromise.java:82)
    at io.netty.channel.AbstractChannel$CloseFuture.setClosed(AbstractChannel.java:956)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.doClose0(AbstractChannel.java:608)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.close(AbstractChannel.java:586)
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.closeOnRead(AbstractNioByteChannel.java:71)
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:158)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
    at java.lang.Thread.run(Thread.java:745)
16/11/29 13:34:58 INFO RSCClient: Failing pending job 377c83db-9876-4d22-9c2c-2562153c65dc due to shutdown.
16/11/29 13:34:58 INFO InteractiveSession: Stopping InteractiveSession 0...
16/11/29 13:34:58 WARN RpcDispatcher: [ClientProtocol] Closing RPC channel with 1 outstanding RPCs.
16/11/29 13:34:58 INFO InteractiveSession: Stopped InteractiveSession 0.
java.util.concurrent.CancellationException

The YARN stdout log also shows:

Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
16/11/29 13:07:17 INFO util.SignalUtils: Registered signal handler for TERM
16/11/29 13:07:17 INFO util.SignalUtils: Registered signal handler for HUP
16/11/29 13:07:17 INFO util.SignalUtils: Registered signal handler for INT
16/11/29 13:07:17 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/11/29 13:07:17 INFO yarn.ApplicationMaster: Preparing Local resources
16/11/29 13:07:18 INFO yarn.ApplicationMaster: Prepared Local resources Map(spark-tags_2.11-2.0.1.jar -> resource { scheme: "hdfs" host: "192.168.0.32" port: 9000 file: "/user/pcitbu/.sparkStaging/application_1480415886078_0011/spark-tags_2.11-2.0.1.jar" } size: 15305 timestamp: 1480421234168 type: FILE visibility: PRIVATE, livy-client-http-0.2.0.jar -> resource { scheme: "hdfs" host: "192.168.0.32" port: 9000 file: "/user/pcitbu/.sparkStaging/application_1480415886078_0011/livy-client-http-0.2.0.jar" } size: 3201057 timestamp: 1480421234217 type: FILE visibility: PRIVATE, livy-core_2.11-0.3.0-SNAPSHOT.jar -> resource { scheme: "hdfs" host: "192.168.0.32" port: 9000 file: "/user/pcitbu/.sparkStaging/application_1480415886078_0011/livy-core_2.11-0.3.0-SNAPSHOT.jar" } size: 88650 timestamp: 1480421234969 type: FILE visibility: PRIVATE, scala-reflect-2.10.6.jar -> resource { scheme: "hdfs" host: "192.168.0.32" port: 9000 file: "/user/pcitbu/.sparkStaging/application_1480415886078_0011/scala-reflect-2.10.6.jar" } size: 3206180 timestamp: 1480421234535 type: FILE visibility: PRIVATE, __spark_libs__ -> resource { scheme: "hdfs" host: "192.168.0.32" port: 9000 file: "/user/pcitbu/.sparkStaging/application_1480415886078_0011/__spark_libs__3457234705671129701.zip" } size: 156594226 timestamp: 1480421234050 type: ARCHIVE visibility: PRIVATE, livy-repl_2.11-0.3.0-SNAPSHOT.jar -> resource { scheme: "hdfs" host: "192.168.0.32" port: 9000 file: "/user/pcitbu/.sparkStaging/application_1480415886078_0011/livy-repl_2.11-0.3.0-SNAPSHOT.jar" } size: 933654 timestamp: 1480421234847 type: FILE visibility: PRIVATE, scala-library-2.10.4.jar -> resource { scheme: "hdfs" host: "192.168.0.32" port: 9000 file: "/user/pcitbu/.sparkStaging/application_1480415886078_0011/scala-library-2.10.4.jar" } size: 7126372 timestamp: 1480421234651 type: FILE visibility: PRIVATE, netty-all-4.0.29.Final.jar -> resource { scheme: "hdfs" host: "192.168.0.32" port: 9000 file: 
"/user/pcitbu/.sparkStaging/application_1480415886078_0011/netty-all-4.0.29.Final.jar" } size: 2054931 timestamp: 1480421234273 type: FILE visibility: PRIVATE, livy-rsc-0.3.0-SNAPSHOT.jar -> resource { scheme: "hdfs" host: "192.168.0.32" port: 9000 file: "/user/pcitbu/.sparkStaging/application_1480415886078_0011/livy-rsc-0.3.0-SNAPSHOT.jar" } size: 481989 timestamp: 1480421234777 type: FILE visibility: PRIVATE, scala-reflect-2.10.4.jar -> resource { scheme: "hdfs" host: "192.168.0.32" port: 9000 file: "/user/pcitbu/.sparkStaging/application_1480415886078_0011/scala-reflect-2.10.4.jar" } size: 3203471 timestamp: 1480421234122 type: FILE visibility: PRIVATE, __spark_conf__ -> resource { scheme: "hdfs" host: "192.168.0.32" port: 9000 file: "/user/pcitbu/.sparkStaging/application_1480415886078_0011/__spark_conf__.zip" } size: 92857 timestamp: 1480421235040 type: ARCHIVE visibility: PRIVATE, scalatest_2.11-2.2.6.jar -> resource { scheme: "hdfs" host: "192.168.0.32" port: 9000 file: "/user/pcitbu/.sparkStaging/application_1480415886078_0011/scalatest_2.11-2.2.6.jar" } size: 7439504 timestamp: 1480421234361 type: FILE visibility: PRIVATE, livy-api-0.3.0-SNAPSHOT.jar -> resource { scheme: "hdfs" host: "192.168.0.32" port: 9000 file: "/user/pcitbu/.sparkStaging/application_1480415886078_0011/livy-api-0.3.0-SNAPSHOT.jar" } size: 9079 timestamp: 1480421234707 type: FILE visibility: PRIVATE, commons-codec-1.9.jar -> resource { scheme: "hdfs" host: "192.168.0.32" port: 9000 file: "/user/pcitbu/.sparkStaging/application_1480415886078_0011/commons-codec-1.9.jar" } size: 263965 timestamp: 1480421234916 type: FILE visibility: PRIVATE, scala-library-2.10.6.jar -> resource { scheme: "hdfs" host: "192.168.0.32" port: 9000 file: "/user/pcitbu/.sparkStaging/application_1480415886078_0011/scala-library-2.10.6.jar" } size: 7132001 timestamp: 1480421234457 type: FILE visibility: PRIVATE, scala-xml_2.11-1.0.2.jar -> resource { scheme: "hdfs" host: "192.168.0.32" port: 9000 file: 
"/user/pcitbu/.sparkStaging/application_1480415886078_0011/scala-xml_2.11-1.0.2.jar" } size: 648678 timestamp: 1480421234402 type: FILE visibility: PRIVATE)
16/11/29 13:07:18 INFO yarn.ApplicationMaster: ApplicationAttemptId: appattempt_1480415886078_0011_000001
16/11/29 13:07:18 INFO spark.SecurityManager: Changing view acls to: pcitbu
16/11/29 13:07:18 INFO spark.SecurityManager: Changing modify acls to: pcitbu
16/11/29 13:07:18 INFO spark.SecurityManager: Changing view acls groups to: 
16/11/29 13:07:18 INFO spark.SecurityManager: Changing modify acls groups to: 
16/11/29 13:07:18 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(pcitbu); groups with view permissions: Set(); users  with modify permissions: Set(pcitbu); groups with modify permissions: Set()
16/11/29 13:07:18 INFO yarn.ApplicationMaster: Starting the user application in a separate Thread
16/11/29 13:07:18 INFO yarn.ApplicationMaster: Waiting for spark context initialization
16/11/29 13:07:18 INFO yarn.ApplicationMaster: Waiting for spark context initialization ... 
16/11/29 13:07:18 INFO driver.RSCDriver: Connecting to: 192.168.199.1:41972
16/11/29 13:07:18 INFO driver.RSCDriver: Starting RPC server...
16/11/29 13:07:18 WARN rsc.RSCConf: Your hostname, pcitbumint, resolves to a loopback address; using 192.168.199.1  instead (on interface vmnet8)
16/11/29 13:07:18 WARN rsc.RSCConf: Set 'livy.rsc.rpc.server.address' if you need to bind to another address.
16/11/29 13:07:18 ERROR yarn.ApplicationMaster: User class threw exception: java.lang.VerifyError: Bad type on operand stack
Exception Details:
  Location:
    com/cloudera/livy/repl/SparkInterpreter.bind(Ljava/lang/String;Ljava/lang/String;Ljava/lang/Object;Lscala/collection/immutable/List;)V @7: invokevirtual
  Reason:
    Type 'org/apache/spark/repl/SparkILoop' (current frame, stack[1]) is not assignable to 'scala/tools/nsc/interpreter/ILoop'
  Current Frame:
    bci: @7
    flags: { }
    locals: { 'com/cloudera/livy/repl/SparkInterpreter', 'java/lang/String', 'java/lang/String', 'java/lang/Object', 'scala/collection/immutable/List' }
    stack: { 'scala/tools/nsc/interpreter/ILoop$', 'org/apache/spark/repl/SparkILoop' }
  Bytecode:
    0x0000000: b201 052a b600 52b6 0109 bb00 2359 2a2b
    0x0000010: 2c2d 1904 b701 63b6 0166 57b1          

java.lang.VerifyError: Bad type on operand stack
Exception Details:
  Location:
    com/cloudera/livy/repl/SparkInterpreter.bind(Ljava/lang/String;Ljava/lang/String;Ljava/lang/Object;Lscala/collection/immutable/List;)V @7: invokevirtual
  Reason:
    Type 'org/apache/spark/repl/SparkILoop' (current frame, stack[1]) is not assignable to 'scala/tools/nsc/interpreter/ILoop'
  Current Frame:
    bci: @7
    flags: { }
    locals: { 'com/cloudera/livy/repl/SparkInterpreter', 'java/lang/String', 'java/lang/String', 'java/lang/Object', 'scala/collection/immutable/List' }
    stack: { 'scala/tools/nsc/interpreter/ILoop$', 'org/apache/spark/repl/SparkILoop' }
  Bytecode:
    0x0000000: b201 052a b600 52b6 0109 bb00 2359 2a2b
    0x0000010: 2c2d 1904 b701 63b6 0166 57b1          

    at com.cloudera.livy.repl.ReplDriver.initializeContext(ReplDriver.scala:49)
    at com.cloudera.livy.rsc.driver.RSCDriver.run(RSCDriver.java:315)
    at com.cloudera.livy.rsc.driver.RSCDriverBootstrapper.main(RSCDriverBootstrapper.java:86)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:627)
16/11/29 13:07:18 INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 15, (reason: User class threw exception: java.lang.VerifyError: Bad type on operand stack
Exception Details:
  Location:
    com/cloudera/livy/repl/SparkInterpreter.bind(Ljava/lang/String;Ljava/lang/String;Ljava/lang/Object;Lscala/collection/immutable/List;)V @7: invokevirtual
  Reason:
    Type 'org/apache/spark/repl/SparkILoop' (current frame, stack[1]) is not assignable to 'scala/tools/nsc/interpreter/ILoop'
  Current Frame:
    bci: @7
    flags: { }
    locals: { 'com/cloudera/livy/repl/SparkInterpreter', 'java/lang/String', 'java/lang/String', 'java/lang/Object', 'scala/collection/immutable/List' }
    stack: { 'scala/tools/nsc/interpreter/ILoop$', 'org/apache/spark/repl/SparkILoop' }
  Bytecode:
    0x0000000: b201 052a b600 52b6 0109 bb00 2359 2a2b
    0x0000010: 2c2d 1904 b701 63b6 0166 57b1          
)
16/11/29 13:07:28 ERROR yarn.ApplicationMaster: SparkContext did not initialize after waiting for 100000 ms. Please check earlier log output for errors. Failing the application.
16/11/29 13:07:28 INFO yarn.ApplicationMaster: Unregistering ApplicationMaster with FAILED (diag message: User class threw exception: java.lang.VerifyError: Bad type on operand stack
Exception Details:
  Location:
    com/cloudera/livy/repl/SparkInterpreter.bind(Ljava/lang/String;Ljava/lang/String;Ljava/lang/Object;Lscala/collection/immutable/List;)V @7: invokevirtual
  Reason:
    Type 'org/apache/spark/repl/SparkILoop' (current frame, stack[1]) is not assignable to 'scala/tools/nsc/interpreter/ILoop'
  Current Frame:
    bci: @7
    flags: { }
    locals: { 'com/cloudera/livy/repl/SparkInterpreter', 'java/lang/String', 'java/lang/String', 'java/lang/Object', 'scala/collection/immutable/List' }
    stack: { 'scala/tools/nsc/interpreter/ILoop$', 'org/apache/spark/repl/SparkILoop' }
  Bytecode:
    0x0000000: b201 052a b600 52b6 0109 bb00 2359 2a2b
    0x0000010: 2c2d 1904 b701 63b6 0166 57b1          
)
16/11/29 13:07:28 INFO yarn.ApplicationMaster: Deleting staging directory hdfs://192.168.0.32:9000/user/pcitbu/.sparkStaging/application_1480415886078_0011
16/11/29 13:07:28 INFO util.ShutdownHookManager: Shutdown hook called

Can anyone help me resolve this error?
Thanks.
