I have a project that uses Spark Streaming, which I am running with spark-submit, but I am hitting the following error:
15/01/14 10:34:18 ERROR ReceiverTracker: Deregistered receiver for stream 0: Error starting receiver 0 - java.lang.AbstractMethodError
at org.apache.spark.Logging$class.log(Logging.scala:52)
at org.apache.spark.streaming.kafka.KafkaReceiver.log(KafkaInputDStream.scala:66)
at org.apache.spark.Logging$class.logInfo(Logging.scala:59)
at org.apache.spark.streaming.kafka.KafkaReceiver.logInfo(KafkaInputDStream.scala:66)
at org.apache.spark.streaming.kafka.KafkaReceiver.onStart(KafkaInputDStream.scala:86)
at org.apache.spark.streaming.receiver.ReceiverSupervisor.startReceiver(ReceiverSupervisor.scala:121)
at org.apache.spark.streaming.receiver.ReceiverSupervisor.start(ReceiverSupervisor.scala:106)
at org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverLauncher$$anonfun$9.apply(ReceiverTracker.scala:264)
at org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverLauncher$$anonfun$9.apply(ReceiverTracker.scala:257)
at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1121)
at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1121)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:62)
at org.apache.spark.scheduler.Task.run(Task.scala:54)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:177)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Here is the code that produces the error; everything runs fine until ssc.start() is called:
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Command-line arguments: ZooKeeper quorum, Kafka consumer group, topics, receiver thread count
val Array(zkQuorum, group, topics, numThreads) = args
val sparkConf = new SparkConf().setAppName("Jumbly_StreamingConsumer")
// Streaming context with a 2-second batch interval
val ssc = new StreamingContext(sparkConf, Seconds(2))
ssc.checkpoint("checkpoint")
.
.
.
ssc.start()
ssc.awaitTermination()
I have run the SparkPi example with spark-submit and it works fine, so I can't figure out what is causing the problem in my application. Any help would be much appreciated.
1 Answer
From the documentation for java.lang.AbstractMethodError:
"Normally, this error is caught by the compiler; this error can only occur at run time if the definition of some class has incompatibly changed since the currently executing method was last compiled."
This means there is a version incompatibility between your compile-time dependencies and your runtime dependencies. Make sure those versions are aligned to resolve the issue.
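
As an illustration (not from the original answer), here is a minimal sbt sketch of what aligning those versions can look like. It assumes an sbt build, Scala 2.10, and Spark 1.2.0 installed on the cluster; the version numbers are placeholders and must match whatever distribution spark-submit actually runs against:

// build.sbt -- hypothetical sketch; pin every Spark artifact to the
// version installed on the cluster.
scalaVersion := "2.10.4"

// Assumed cluster version; all Spark artifacts below use the same value.
val sparkVersion = "1.2.0"

libraryDependencies ++= Seq(
  // "provided": the cluster supplies these jars at runtime, so the
  // application jar does not bundle a second, possibly different, copy.
  "org.apache.spark" %% "spark-core"            % sparkVersion % "provided",
  "org.apache.spark" %% "spark-streaming"       % sparkVersion % "provided",
  // The Kafka connector is not part of the Spark distribution, so it is
  // bundled with the application jar but still pinned to the same version.
  "org.apache.spark" %% "spark-streaming-kafka" % sparkVersion
)

After changing the versions, rebuild the application jar (for example with sbt-assembly) and submit it again with spark-submit.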