NullPointerException when submitting a jar that integrates Kafka with Spark on Ubuntu. I am trying to run the code from https://github.com/apache/spark/tree/v2.1.1/examples
I checked whether installing Spark on Ubuntu requires setting HADOOP_HOME; HADOOP_HOME is not set, and I double-checked the arguments passed for the jar.
./bin/spark-submit --class "org.apache.spark.examples.streaming.JavaKafkaWordCount" --packages org.apache.spark:spark-streaming-kafka-0-10_2.11:2.1.0 --master local[*] --jars ~/software/JavaKafkaWordCount.jar localhost:2181 test-consumer-group streams-plaintext-input 1
java.lang.NullPointerException
	at org.apache.hadoop.fs.Path.getName(Path.java:337)
	at org.apache.spark.deploy.DependencyUtils$.downloadFile(DependencyUtils.scala:136)
	at org.apache.spark.deploy.SparkSubmit$$anonfun$prepareSubmitEnvironment$7.apply(SparkSubmit.scala:367)
	at org.apache.spark.deploy.SparkSubmit$$anonfun$prepareSubmitEnvironment$7.apply(SparkSubmit.scala:367)
	at scala.Option.map(Option.scala:146)
	at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:366)
	at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:143)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
1 Answer
The URI of your jar path cannot be parsed; see DependencyUtils.scala#L136.
In your spark-submit command, change the argument as follows:
--jars /fullpath/JavaKafkaWordCount.jar
instead of --jars ~/software/JavaKafkaWordCount.jar
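A minimal sketch of one way to do this, assuming GNU coreutils on Ubuntu and the jar location from the question: expand the path to an absolute one in the shell first, so Spark's URI parsing never sees a `~`.

```shell
# Resolve the jar to an absolute path before building the spark-submit
# command. readlink -m canonicalizes without requiring the file to exist
# (jar location here is the one assumed in the question).
JAR="$(readlink -m "$HOME/software/JavaKafkaWordCount.jar")"
echo "$JAR"

# Then pass the absolute path instead of a tilde:
# ./bin/spark-submit --class "org.apache.spark.examples.streaming.JavaKafkaWordCount" \
#   --packages org.apache.spark:spark-streaming-kafka-0-10_2.11:2.1.0 \
#   --master 'local[*]' --jars "$JAR" localhost:2181 test-consumer-group streams-plaintext-input 1
```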