Cannot start Spark application with Bahir

jbose2ul · asked on 2021-05-16 · in Spark

I am trying to run a Spark application written in Scala that connects to ActiveMQ. I use Bahir for this: `format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")`. When I use Bahir 2.2 in my build.sbt the application runs fine, but after changing it to Bahir 3.0 or Bahir 4.0 the application does not start and I get the error:

```
[error] (run-main-0) java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FSDataInputStream
```

How can I fix this? Can I use Bahir with Spark Structured Streaming to connect to an ActiveMQ topic?
Edit: my build.sbt

```
//For spark
libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-core" % "2.4.0",
    "org.apache.spark" %% "spark-mllib" % "2.4.0",
    "org.apache.spark" %% "spark-sql" % "2.4.0",
    "org.apache.spark" %% "spark-hive" % "2.4.0",
    "org.apache.spark" %% "spark-streaming" % "2.4.0",
    "org.apache.spark" %% "spark-graphx" % "2.4.0",
)

//Bahir
libraryDependencies += "org.apache.bahir" %% "spark-sql-streaming-mqtt" % "2.4.0"
```

Answer from 8wigbo56:

Well, it seems there is some compatibility issue between Spark 2.4 and Bahir 2.4. I rolled both of them back to version 2.3.
Here is my build.sbt:

```
name := "sparkTest"

version := "0.1"

scalaVersion := "2.11.11"

//For spark
libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-core" % "2.3.0",
    "org.apache.spark" %% "spark-mllib" % "2.3.0",
    "org.apache.spark" %% "spark-sql" % "2.3.0",
    "org.apache.spark" %% "spark-hive" % "2.3.0",
    "org.apache.spark" %% "spark-streaming" % "2.3.0",
    "org.apache.spark" %% "spark-graphx" % "2.3.0",
    // "org.apache.spark" %% "spark-streaming-kafka" % "1.6.3",
)

//Bahir
libraryDependencies += "org.apache.bahir" %% "spark-sql-streaming-mqtt" % "2.3.0"
```
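
As for the second part of the question: yes, with matching versions the Bahir MQTT source can read from an ActiveMQ topic over ActiveMQ's MQTT transport. Below is a minimal sketch; the broker URL `tcp://localhost:1883` and the topic name `sensors` are placeholder assumptions for your own setup, and note that the column layout of the source (e.g. a `value` string column) can differ between Bahir releases:

```scala
import org.apache.spark.sql.SparkSession

object MqttExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("sparkTest")
      .master("local[*]")
      .getOrCreate()

    // Subscribe to an MQTT topic via Bahir's structured-streaming source.
    // "sensors" and the broker URL are placeholders for your ActiveMQ setup.
    val lines = spark.readStream
      .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
      .option("topic", "sensors")
      .load("tcp://localhost:1883")

    // Dump incoming messages to the console for a quick smoke test.
    val query = lines.writeStream
      .outputMode("append")
      .format("console")
      .start()

    query.awaitTermination()
  }
}
```

For this to work, the ActiveMQ broker must have its MQTT transport connector enabled (it listens on port 1883 by default); the plain OpenWire port will not accept MQTT connections.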
