I want to install a Kafka broker directly on Databricks.
The goal is just to demo a local Kafka use case with Spark Streaming.
Here is how I proceeded:
wget https://packages.confluent.io/archive/7.4/confluent-7.4.0.tar.gz
tar -xvf confluent-7.4.0.tar.gz
./confluent-7.4.0/bin/zookeeper-server-start ./etc/kafka/zookeeper.properties&
./confluent-7.4.0/bin/kafka-server-start ./etc/kafka/server.properties&
However, the last command throws a strange error:
./confluent-7.4.0/bin/kafka-server-start ./etc/kafka/server.properties&
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/databricks/jars/----ws_3_3--mvn--hadoop3--org.apache.logging.log4j--log4j-slf4j-impl--org.apache.logging.log4j__log4j-slf4j-impl__2.18.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/databricks/driver/confluent-7.4.0/share/java/kafka/slf4j-reload4j-1.7.36.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
[2023-07-06 12:48:45,041] INFO Registered kafka:type=kafka.Log4jController MBean (kafka.utils.Log4jControllerRegistration$)
[2023-07-06 12:48:45,629] ERROR Exiting Kafka due to fatal exception (kafka.Kafka$)
java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Ljava/lang/Object;
at kafka.Kafka$.getPropsFromArgs(Kafka.scala:42) ~[kafka_2.13-7.4.0-ce.jar:?]
at kafka.Kafka$.main(Kafka.scala:91) ~[kafka_2.13-7.4.0-ce.jar:?]
at kafka.Kafka.main(Kafka.scala) ~[kafka_2.13-7.4.0-ce.jar:?]
Exception in thread "main" java.lang.NoSuchMethodError: scala.Option.orNull(Lscala/$less$colon$less;)Ljava/lang/Object;
at kafka.utils.Exit$.exit(Exit.scala:28)
at kafka.Kafka$.main(Kafka.scala:127)
at kafka.Kafka.main(Kafka.scala)
Do you know what could trigger this? Thanks!
2 Answers

jaql4c8m1#
The error says you are using an incompatible Scala version.
Try downloading the 2.12 build, if available. You also don't need the Confluent Platform.
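A quick way to confirm the mismatch is to compare the Scala version baked into the Kafka jar name from the stack trace (`kafka_2.13-7.4.0-ce.jar`) with the Scala version your Databricks runtime ships. A minimal sketch of the jar-name check, with illustrative parsing:

```shell
# The jar name from the stack trace encodes the Scala version Kafka was
# built against: kafka_<scala-version>-<kafka-version>.jar
jar="kafka_2.13-7.4.0-ce.jar"
scala_version="${jar#kafka_}"         # strip the "kafka_" prefix
scala_version="${scala_version%%-*}"  # keep everything before the first "-"
echo "Kafka was built for Scala $scala_version"  # -> 2.13
```

If this prints a version different from the cluster's Scala version (2.12 on the Databricks runtimes discussed here), the `NoSuchMethodError` is expected, since Scala 2.12 and 2.13 are not binary compatible.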
mzillmmw2#
@OneCricketeer's answer made me double-check Scala/Kafka version compatibility.
Indeed: "Since Apache Kafka 3.0, Scala 2.12 support has been deprecated and it will be removed in Apache Kafka 4.0."
Databricks does NOT support Scala 2.13 (*yet*).
So I stuck with a Kafka version that still supports 2.12 by default.
https://index.scala-lang.org/apache/kafka/artifacts/kafka
I don't need Confluent at all, but just in case:
with Confluent 5.2 I got Kafka 2.2, and I got it working.
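For reference, here is a sketch of the same setup using a plain Apache Kafka build compiled against Scala 2.12 instead of the Confluent archive. The version number 3.4.0 is only an illustrative pick of a release that still ships a 2.12 build; adjust it to whatever matches your cluster:

```shell
# Assumed/illustrative: Kafka 3.4.0 with a Scala 2.12 build from the
# Apache archive. Script names and the config/ layout are those of the
# Apache distribution (note the .sh suffix, unlike the Confluent bin/).
SCALA_VERSION=2.12
KAFKA_VERSION=3.4.0
wget "https://archive.apache.org/dist/kafka/${KAFKA_VERSION}/kafka_${SCALA_VERSION}-${KAFKA_VERSION}.tgz"
tar -xzf "kafka_${SCALA_VERSION}-${KAFKA_VERSION}.tgz"
./kafka_${SCALA_VERSION}-${KAFKA_VERSION}/bin/zookeeper-server-start.sh \
  ./kafka_${SCALA_VERSION}-${KAFKA_VERSION}/config/zookeeper.properties &
./kafka_${SCALA_VERSION}-${KAFKA_VERSION}/bin/kafka-server-start.sh \
  ./kafka_${SCALA_VERSION}-${KAFKA_VERSION}/config/server.properties &
```

Since the broker jar is then `kafka_2.12-…`, it links against the same Scala runtime as the Databricks JVM and the `NoSuchMethodError` from the question should not appear.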