Spark 2.0 Thrift server does not start in YARN mode

um6iljoc · posted 2021-06-02 in Hadoop

I started the Spark 2.0 Thrift server in my local environment and it runs fine; when I try the same in the cluster (YARN) environment, startup fails with the exception shown below.
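For context, starting the Thrift server against YARN is typically done with the stock script, roughly as follows (the post does not show the exact command, so the flags here are an assumption; client deploy mode matches the YarnClientSchedulerBackend seen in the trace):

$SPARK_HOME/sbin/start-thriftserver.sh \
  --master yarn \
  --deploy-mode client

The driver log from the failed startup: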

16/06/02 10:21:06 INFO spark.SparkContext: Successfully stopped SparkContext
Exception in thread "main" org.apache.spark.SparkException: Yarn application has already ended! It might have been killed or unable to launch application master.
    at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.waitForApplication(YarnClientSchedulerBackend.scala:85)
    at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:62)
    at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:148)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:502)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2246)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:749)
    at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:57)
    at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2$.main(HiveThriftServer2.scala:81)
    at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2.main(HiveThriftServer2.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:724)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
16/06/02 10:21:06 INFO util.ShutdownHookManager: Shutdown hook called

The driver-side SparkException is only a symptom: the driver gave up waiting for the Application Master, so the real cause is in the AM container log, which shows:

Error: Could not find or load main class org.apache.spark.deploy.yarn.ExecutorLauncher
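This usually means the container launched for the Application Master cannot see the Spark runtime classes on its classpath. Spark 2.0 dropped the old monolithic spark-assembly jar; unless spark.yarn.jars or spark.yarn.archive is set, spark-submit must package and upload the local jars itself, and a stale 1.x-era setting or a broken upload leaves the AM with nothing to load. A minimal fix sketch, assuming the Spark 2.0 jars are published to HDFS under /spark/jars (the path is an assumption), would be to add this line to spark-defaults.conf:

spark.yarn.jars hdfs:///spark/jars/*.jar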

Spark default configuration:

spark.executor.memory 2g
spark.driver.memory 4g
spark.executor.cores 1
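Note that this configuration only sets memory and cores; nothing tells YARN where the Spark 2.0 jars live, which is consistent with the ExecutorLauncher error above. Assuming the spark.yarn.jars sketch, the jars could be published once with (paths are assumptions):

hdfs dfs -mkdir -p /spark/jars
hdfs dfs -put $SPARK_HOME/jars/*.jar /spark/jars/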
