How to run a Spark application on a Mesos Spark cluster from IntelliJ IDEA

Asked by syqv5f0l on 2021-06-21, in Mesos

I have a Spark cluster on Mesos, with a master and a slave on the same host. I can run the official Spark examples with spark-submit, like this:

./bin/spark-submit --deploy-mode cluster --master mesos://<master_ip>:7077 --class org.apache.spark.examples.SparkPi /opt/spark/lib/spark-examples-1.4.0-hadoop2.6.0.jar

I have also tried building the application with IntelliJ IDEA. It works when I run this code on my local machine:

import org.apache.spark.{SparkConf, SparkContext}

object SimpleApp {
  def main(args: Array[String]) {
    val conf = new SparkConf()
      .setAppName("Simple Application")
      .setMaster("local")
    val sc = new SparkContext(conf)
    ...
  }
}

Running locally is fine, but when I change it to run on Spark over Mesos:

import org.apache.spark.{SparkConf, SparkContext}

object SimpleApp {
  def main(args: Array[String]) {
    val conf = new SparkConf()
      .setAppName("Simple Application")
      .setMaster("mesos://<master_ip>:7077")
    val sc = new SparkContext(conf)
    ...
  }
}

I get this error:

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/myusername/project/spark-01/lib/spark-assembly-1.4.0-hadoop2.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/myusername/project/spark-01/lib/spark-examples-1.4.0-hadoop2.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
15/08/18 16:29:53 INFO SparkContext: Running Spark version 1.4.0
15/08/18 16:29:54 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/08/18 16:29:54 INFO SecurityManager: Changing view acls to: myusername
15/08/18 16:29:54 INFO SecurityManager: Changing modify acls to: myusername
15/08/18 16:29:54 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(myusername); users with modify permissions: Set(myusername)
15/08/18 16:29:56 INFO Slf4jLogger: Slf4jLogger started
15/08/18 16:29:56 INFO Remoting: Starting remoting
15/08/18 16:29:56 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@172.23.10.21:37362]
15/08/18 16:29:56 INFO Utils: Successfully started service 'sparkDriver' on port 37362.
15/08/18 16:29:56 INFO SparkEnv: Registering MapOutputTracker
15/08/18 16:29:56 INFO SparkEnv: Registering BlockManagerMaster
15/08/18 16:29:57 INFO DiskBlockManager: Created local directory at /tmp/spark-29fef56b-0a26-4cd7-b391-2f436bca1c55/blockmgr-b7febe40-5d37-4862-be78-4b6f4df1738c
15/08/18 16:29:57 INFO MemoryStore: MemoryStore started with capacity 953.4 MB
15/08/18 16:29:57 INFO HttpFileServer: HTTP File server directory is /tmp/spark-29fef56b-0a26-4cd7-b391-2f436bca1c55/httpd-94618d51-782f-4262-a113-8d44bf0b29d7
15/08/18 16:29:57 INFO HttpServer: Starting HTTP Server
15/08/18 16:29:57 INFO Utils: Successfully started service 'HTTP file server' on port 59838.
15/08/18 16:29:57 INFO SparkEnv: Registering OutputCommitCoordinator
15/08/18 16:29:57 INFO Utils: Successfully started service 'SparkUI' on port 4040.
15/08/18 16:29:57 INFO SparkUI: Started SparkUI at http://172.23.10.21:4040
Failed to load native Mesos library from /home/myusername/current/idea.14/bin::/usr/java/packages/lib/amd64:/usr/lib/x86_64-linux-gnu/jni:/lib/x86_64-linux-gnu:/usr/lib/x86_64-linux-gnu:/usr/lib/jni:/lib:/usr/lib
Exception in thread "main" java.lang.UnsatisfiedLinkError: no mesos in java.library.path
  at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1886)
  at java.lang.Runtime.loadLibrary0(Runtime.java:849)
  at java.lang.System.loadLibrary(System.java:1088)
  at org.apache.mesos.MesosNativeLibrary.load(MesosNativeLibrary.java:54)
  at org.apache.mesos.MesosNativeLibrary.load(MesosNativeLibrary.java:79)
  at org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$createTaskScheduler(SparkContext.scala:2535)
  at org.apache.spark.SparkContext.<init>(SparkContext.scala:489)
  at SimpleApp$.main(SimpleApp.scala:11)
  at SimpleApp.main(SimpleApp.scala)

I worked around this by importing all of the Mesos .so dependencies into the project, but that is not a good solution, because then every developer needs to know about (and install) Mesos. So what I have been researching is how to run a Spark application from IntelliJ IDEA, but every example I find only covers the first scenario, running locally.

My questions:

Is this a valid way to develop Spark applications, or is the right workflow to develop the algorithm locally and then use spark-submit to run it on Mesos?

Does anyone know a better way to run a Spark application from IntelliJ IDEA against a Spark-on-Mesos cluster?
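For reference, the immediate failure in the log above is the driver JVM being unable to locate the native Mesos library (`no mesos in java.library.path`). A sketch of a common workaround, under the assumption that libmesos is installed at `/usr/local/lib` and that the Spark distribution and jar paths below are placeholders to adapt:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object SimpleApp {
  def main(args: Array[String]) {
    // In the IntelliJ run configuration, set the environment variable
    //   MESOS_NATIVE_JAVA_LIBRARY=/usr/local/lib/libmesos.so   (path is an assumption)
    // so the driver JVM can load the native Mesos library before SparkContext starts.
    val conf = new SparkConf()
      .setAppName("Simple Application")
      .setMaster("mesos://<master_ip>:7077")
      // Tell Mesos executors where to fetch the Spark distribution
      // (URL is a placeholder, not from the original post)
      .set("spark.executor.uri", "http://<some_host>/spark-1.4.0-bin-hadoop2.6.tgz")
      // Ship the application jar so executors can run the task closures
      // (path assumes a default sbt build layout)
      .setJars(Seq("target/scala-2.10/simple-app_2.10-1.0.jar"))
    val sc = new SparkContext(conf)
    // ... application logic ...
    sc.stop()
  }
}
```

This keeps the Mesos-specific knowledge in the run configuration rather than in shared project dependencies; the alternative workflow mentioned above, developing against `local` and deploying with spark-submit, avoids the native-library requirement on developer machines entirely.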

No answers yet.
