I built a project with Spark Scala (a recommendation system), and I ran into a problem when executing my app with spark-submit: it throws an exception. My source file is projectfilms/src/main/scala/appfilms.scala. I compiled it with `sbt package`, which succeeded and produced a jar file:
[info] Set current project to system of recommandation (in build file:/root/projectFilms/)
[info] Compiling 1 Scala source to /root/projectFilms/target/scala-2.11/classes...
[info] Packaging /root/projectFilms/target/scala-2.11/system-of-recommandation_2.11-1.0.jar ...
[info] Done packaging.
[success] Total time: 35 s, completed Nov 22, 2016 7:31:59 PM
But when I run the application with spark-submit:
spark-submit --class "AppFilms" --master local[4] target/scala-2.11/system-of-recommandation_2.11-1.0.jar
it throws this exception:
java.lang.ClassNotFoundException: AppFilms
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:278)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:634)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:170)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:193)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:112)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
16/11/22 19:32:33 INFO Utils: Shutdown hook called
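For context, this is roughly what I assume the entry point in appfilms.scala needs to look like for `--class "AppFilms"` to resolve — the object name, app name, and body here are my sketch, not the actual file contents:

```scala
// Sketch of appfilms.scala (assumed, not the real file).
// spark-submit's --class must match the object name exactly,
// including any package prefix if the file declares one.
import org.apache.spark.{SparkConf, SparkContext}

object AppFilms {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("system of recommandation")
    val sc = new SparkContext(conf)
    // ... recommendation logic would go here ...
    sc.stop()
  }
}
```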
Please help me with this!