I am trying to run a Spark Scala program in which I use `import org.apache.spark.internal.Logging`. The program ran fine locally until I tried to build a fat jar by adding an assembly.sbt alongside my build.sbt.
Here is my build.sbt:
```scala
lazy val root = (project in file(".")).
  settings(
    name := "TestingSparkApplicationProject",
    version := "0.0.1-SNAPSHOT",
    scalaVersion := "2.11.12",
    mainClass in Compile := Some("com.test.spark.job.TestSparkJob.scala")
  )

val sparkVersion = "2.4.3"

libraryDependencies ++= Seq(
  "net.jcazevedo" %% "moultingyaml" % "0.4.2",
  "net.liftweb" %% "lift-json" % "3.4.1",
  "com.github.scopt" %% "scopt" % "4.0.0-RC2",
  "com.google.cloud" % "google-cloud-storage" % "1.23.0",
  "org.apache.spark" %% "spark-core" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-sql" % sparkVersion % "provided"
)

scalaSource in Compile := baseDirectory.value / "src"
```
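One detail worth noting, independent of the assembly error: `mainClass` expects the fully qualified name of the class or object containing `main`, not the name of a source file, so the `.scala` suffix should be dropped. A minimal correction, assuming the entry point is the `TestSparkJob` object mentioned below:

```scala
// mainClass takes the fully qualified name of the object with the main method,
// not the path or name of the .scala source file
mainClass in Compile := Some("com.test.spark.job.TestSparkJob")
```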
My assembly.sbt file contains:

```scala
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.10")
```
However, the build still does not succeed, and the corner of the assembly.sbt file still shows the error message "Expression type def.settings[Seq[ModuleID]] must conform to Dsl entry in sbt file".
When I run the main module `TestSparkJob`, I get the following error:
```
java.lang.NoClassDefFoundError: org/slf4j/Logger
at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
at java.lang.Class.getMethod0(Class.java:3018)
at java.lang.Class.getMethod(Class.java:1784)
at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:544)
at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:526)
Caused by: java.lang.ClassNotFoundException: org.slf4j.Logger
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:355)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
... 7 more
Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main"
```
Update: I got rid of `assembly.sbt` and recreated `build.sbt` with the same code. The code runs fine when deployed to a Dataproc cluster, but I still hit the same error when running it in my local environment.
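For reference, a `NoClassDefFoundError: org/slf4j/Logger` at local run time is consistent with the two Spark dependencies being marked `"provided"`: they, along with their transitive slf4j jars, are then excluded from the runtime classpath of a local `sbt run`, while a Dataproc cluster supplies them itself. A common workaround (the pattern documented in the sbt-assembly README, written here in the same sbt 0.13-style syntax as the build above) is to put provided dependencies back on the classpath for the `run` task only:

```scala
// Keep "provided" dependencies (Spark and its transitive slf4j jars) on the
// classpath for `sbt run`, while still excluding them from the assembled fat jar
run in Compile := Defaults
  .runTask(fullClasspath in Compile, mainClass in (Compile, run), runner in (Compile, run))
  .evaluated
```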