spark-submit fails with Spark Streaming wordcount Python code

kzipqqlq  posted on 2021-06-08 in Kafka

I just copied the Spark Streaming wordcount Python code and used spark-submit to run it on the Spark cluster, but it shows the following error:

py4j.protocol.Py4JJavaError: An error occurred while calling o23.loadClass.
: java.lang.ClassNotFoundException: org.apache.spark.streaming.kafka.KafkaUtilsPythonHelper
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)

I did build the jar spark-streaming-kafka-assembly_2.10-1.4.0-SNAPSHOT.jar. I submit with the following command: bin/spark-submit /data/spark-1.3.0-bin-hadoop2.4/wordcount.py --master spark://192.168.100.6:7077 --jars /data/spark-1.3.0-bin-hadoop2.4/kafka-assembly/target/spark-streaming-kafka-assembly.*.jar
Thanks in advance!
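
For reference, a minimal sketch of the kind of script involved, modeled on the kafka_wordcount.py streaming example that ships with Spark 1.3 (ZooKeeper address and topic are taken from the command line); the KafkaUtils.createStream call is what looks up the KafkaUtilsPythonHelper class named in the error:

import sys
from pyspark import SparkContext
from pyspark.streaming import StreamingContext
from pyspark.streaming.kafka import KafkaUtils

if __name__ == "__main__":
    # usage: wordcount.py <zkQuorum> <topic>
    zkQuorum, topic = sys.argv[1:]

    sc = SparkContext(appName="PythonStreamingKafkaWordCount")
    ssc = StreamingContext(sc, 1)  # 1-second batches

    # createStream calls into the JVM via KafkaUtilsPythonHelper, which is why
    # the spark-streaming-kafka assembly jar has to be on the classpath
    kvs = KafkaUtils.createStream(ssc, zkQuorum, "spark-streaming-consumer", {topic: 1})
    lines = kvs.map(lambda x: x[1])
    counts = (lines.flatMap(lambda line: line.split(" "))
                   .map(lambda word: (word, 1))
                   .reduceByKey(lambda a, b: a + b))
    counts.pprint()

    ssc.start()
    ssc.awaitTermination()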


nxagd54h1#

I had to reference quite a few jars in the command to get this to work. Maybe try referencing the jars explicitly; it may not be picking them up correctly from the assembly jar you built.

/opt/spark/spark-1.3.1-bin-hadoop2.6/bin/spark-submit --jars /root/spark-streaming-kafka_2.10-1.3.1.jar,/usr/hdp/2.2.4.2-2/kafka/libs/kafka_2.10-0.8.1.2.2.4.2-2.jar,/usr/hdp/2.2.4.2-2/kafka/libs/zkclient-0.3.jar,/root/.m2/repository/com/yammer/metrics/metrics-core/2.2.0/metrics-core-2.2.0.jar  kafka_wordcount.py kafkaAddress:2181 topicName

Actually, it looks like it is not picking up this jar: kafka_2.10-0.8.1.2.2.4.2-2.jar
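
One way to check whether a given jar actually contains the helper class from the error is to list its contents (the path below is the assembly jar location from the question; adjust it to your build):

jar tf /data/spark-1.3.0-bin-hadoop2.4/kafka-assembly/target/spark-streaming-kafka-assembly_2.10-1.4.0-SNAPSHOT.jar | grep KafkaUtilsPythonHelper

If that prints nothing, the class is missing from the jar and no amount of --jars flags will make it loadable.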


lbsnaicq2#

Actually, I just realized you put --jars after the script. The jar files will not be included unless they are specified before the script name. So use spark-submit --jars spark-streaming-kafka-assembly_2.10-1.3.1.jar script.py instead of spark-submit script.py --jars spark-streaming-kafka-assembly_2.10-1.3.1.jar.
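
Applied to the command from the question, that would look roughly like this (the master URL and paths are copied from the question; the exact assembly jar filename is assumed from the jar the asker says they built):

# Options such as --master and --jars must come before the application script;
# anything after wordcount.py is passed to the script itself as arguments.
bin/spark-submit --master spark://192.168.100.6:7077 --jars /data/spark-1.3.0-bin-hadoop2.4/kafka-assembly/target/spark-streaming-kafka-assembly_2.10-1.4.0-SNAPSHOT.jar /data/spark-1.3.0-bin-hadoop2.4/wordcount.py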
