I have set up a Spark project with SBT and Scala in IntelliJ and written this code:
import org.apache.spark.sql.SparkSession

object Main {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession
      .builder()
      .master("local[1]")
      .appName("SparkByExample")
      .getOrCreate()
  }
}
When I run it, I get this error:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/sql/SparkSession$
at com.bigdata.Main$.main(Main.scala:6)
at com.bigdata.Main.main(Main.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.sql.SparkSession$
at java.net.URLClassLoader.findClass(URLClassLoader.java:387)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:355)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
... 2 more
My build.sbt:
ThisBuild / version := "0.1.0-SNAPSHOT"
ThisBuild / scalaVersion := "2.12.15"

// https://mvnrepository.com/artifact/org.apache.spark/spark-sql
libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.3.0" % "provided"
// https://mvnrepository.com/artifact/org.apache.spark/spark-core
libraryDependencies += "org.apache.spark" %% "spark-core" % "3.3.0"

lazy val root = (project in file("."))
  .settings(
    name := "untitled",
    idePackagePrefix := Some("com.bigdata")
  )
I can see SPARK_HOME in my PATH and it looks fine. I also tried opening another project, but the problem persists.
1 Answer (by hivapdat1)
You declared the spark-sql dependency with the provided scope, so by default it is not on the classpath when you execute the code locally: provided assumes the runtime environment (for example, a Spark cluster you submit to) supplies those jars. Note that SparkSession lives in spark-sql, which is exactly the artifact you marked as provided, hence the NoClassDefFoundError. If your goal is just to test locally, you can remove the provided scope, or you can configure your IDE to also include provided dependencies. In IntelliJ, you can do this from the run configuration options.
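As a sketch of the first option, here is what the adjusted build.sbt could look like for local runs, assuming the same Spark 3.3.0 and Scala 2.12.15 versions as in the question (spark-sql already depends on spark-core, so a single dependency line suffices):

```scala
ThisBuild / version := "0.1.0-SNAPSHOT"
ThisBuild / scalaVersion := "2.12.15"

// No "provided" scope here: spark-sql (which transitively pulls in
// spark-core) stays on the classpath when running from the IDE or sbt.
libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.3.0"

lazy val root = (project in file("."))
  .settings(
    name := "untitled",
    idePackagePrefix := Some("com.bigdata")
  )
```

Keep in mind the trade-off: if you later submit the jar to a cluster with spark-submit, you would typically restore the provided scope (or build with sbt-assembly and exclude Spark) so the cluster's own Spark jars are used instead of bundled ones.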