Cannot resolve Spark Structured Streaming Kafka dependency

yqlxgs2m, posted 2021-06-06 in Kafka

I tried:

./spark-2.3.1-bin-hadoop2.7/bin/spark-submit --packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.3.1 test.py

On my own machine everything works fine, but when I try it on my school's server I get the messages and errors below. I have been googling for a long time and have no idea what is wrong. Could anyone help me?
Ivy Default Cache set to: /home/zqwang/.ivy2/cache
The jars for the packages stored in: /home/zqwang/.ivy2/jars
:: loading settings :: url = jar:file:/data/opt/tmp/zqwang/spark-2.3.1-bin-hadoop2.7/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
org.apache.spark#spark-sql-kafka-0-10_2.11 added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent-26b526c6-0535-4007-8428-e38188af5709;1.0
        confs: [default]
:: resolution report :: resolve 966ms :: artifacts dl 0ms
        :: modules in use:
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   1   |   0   |   0   |   0   ||   0   |   0   |
        ---------------------------------------------------------------------

:: problems summary ::
:::: WARNINGS
        module not found: org.apache.spark#spark-sql-kafka-0-10_2.11;2.3.1

        ==== local-m2-cache: tried

          file:/home/zqwang/.m2/repository/org/apache/spark/spark-sql-kafka-0-10_2.11/2.3.1/spark-sql-kafka-0-10_2.11-2.3.1.pom

          -- artifact org.apache.spark#spark-sql-kafka-0-10_2.11;2.3.1!spark-sql-kafka-0-10_2.11.jar:

          file:/home/zqwang/.m2/repository/org/apache/spark/spark-sql-kafka-0-10_2.11/2.3.1/spark-sql-kafka-0-10_2.11-2.3.1.jar

        ==== local-ivy-cache: tried

          /home/zqwang/.ivy2/local/org.apache.spark/spark-sql-kafka-0-10_2.11/2.3.1/ivys/ivy.xml

          -- artifact org.apache.spark#spark-sql-kafka-0-10_2.11;2.3.1!spark-sql-kafka-0-10_2.11.jar:

          /home/zqwang/.ivy2/local/org.apache.spark/spark-sql-kafka-0-10_2.11/2.3.1/jars/spark-sql-kafka-0-10_2.11.jar

        ==== central: tried

          https://repo1.maven.org/maven2/org/apache/spark/spark-sql-kafka-0-10_2.11/2.3.1/spark-sql-kafka-0-10_2.11-2.3.1.pom

          -- artifact org.apache.spark#spark-sql-kafka-0-10_2.11;2.3.1!spark-sql-kafka-0-10_2.11.jar:

          https://repo1.maven.org/maven2/org/apache/spark/spark-sql-kafka-0-10_2.11/2.3.1/spark-sql-kafka-0-10_2.11-2.3.1.jar

        ==== spark-packages: tried

          http://dl.bintray.com/spark-packages/maven/org/apache/spark/spark-sql-kafka-0-10_2.11/2.3.1/spark-sql-kafka-0-10_2.11-2.3.1.pom

          -- artifact org.apache.spark#spark-sql-kafka-0-10_2.11;2.3.1!spark-sql-kafka-0-10_2.11.jar:

          http://dl.bintray.com/spark-packages/maven/org/apache/spark/spark-sql-kafka-0-10_2.11/2.3.1/spark-sql-kafka-0-10_2.11-2.3.1.jar

                ::::::::::::::::::::::::::::::::::::::::::::::

                ::          UNRESOLVED DEPENDENCIES         ::

                ::::::::::::::::::::::::::::::::::::::::::::::

                :: org.apache.spark#spark-sql-kafka-0-10_2.11;2.3.1: not found

                ::::::::::::::::::::::::::::::::::::::::::::::

:::: ERRORS
        Server access error at url https://repo1.maven.org/maven2/org/apache/spark/spark-sql-kafka-0-10_2.11/2.3.1/spark-sql-kafka-0-10_2.11-2.3.1.pom (java.net.ConnectException: Connection refused)
        Server access error at url https://repo1.maven.org/maven2/org/apache/spark/spark-sql-kafka-0-10_2.11/2.3.1/spark-sql-kafka-0-10_2.11-2.3.1.jar (java.net.ConnectException: Connection refused)
        Server access error at url http://dl.bintray.com/spark-packages/maven/org/apache/spark/spark-sql-kafka-0-10_2.11/2.3.1/spark-sql-kafka-0-10_2.11-2.3.1.pom (java.net.ConnectException: Connection refused)
        Server access error at url http://dl.bintray.com/spark-packages/maven/org/apache/spark/spark-sql-kafka-0-10_2.11/2.3.1/spark-sql-kafka-0-10_2.11-2.3.1.jar (java.net.ConnectException: Connection refused)

:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: org.apache.spark#spark-sql-kafka-0-10_2.11;2.3.1: not found]
        at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1303)
        at org.apache.spark.deploy.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:53)
        at org.apache.spark.deploy.SparkSubmit$.doPrepareSubmitEnvironment(SparkSubmit.scala:364)
        at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:250)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:171)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)


qkf9rpyu #1

"But when I try it on my school's server I get the messages and errors below"

Your school has a firewall that prevents remote packages from being downloaded. For example, this link works fine for me, yet your server reports:

Server access error at url https://repo1.maven.org/maven2/org/apache/spark/spark-sql-kafka-0-10_2.11/2.3.1/spark-sql-kafka-0-10_2.11-2.3.1.pom (java.net.ConnectException: Connection refused)

You need to download the Kafka jars outside of school, then submit with the --jars flag pointing at them.
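A sketch of that workaround. The kafka-clients version here is an assumption: it is the transitive dependency that spark-sql-kafka-0-10 for Spark 2.3.1 declares, so verify it against that module's POM before relying on it.

```shell
# Workaround sketch: fetch the jars on a machine WITH internet access,
# copy them to the server, then submit with --jars instead of --packages.
BASE=https://repo1.maven.org/maven2
SQL_KAFKA=$BASE/org/apache/spark/spark-sql-kafka-0-10_2.11/2.3.1/spark-sql-kafka-0-10_2.11-2.3.1.jar
# kafka-clients is a transitive dependency of spark-sql-kafka-0-10;
# 0.10.0.1 is the version Spark 2.3.1 declares (an assumption -- check the POM).
KAFKA_CLIENTS=$BASE/org/apache/kafka/kafka-clients/0.10.0.1/kafka-clients-0.10.0.1.jar

# Step 1, off campus:   wget "$SQL_KAFKA" "$KAFKA_CLIENTS"
# Step 2: copy both jars next to test.py on the server.
# Step 3, on the server: note that --jars takes a comma-separated list.
JARS="$(basename "$SQL_KAFKA"),$(basename "$KAFKA_CLIENTS")"
echo ./spark-2.3.1-bin-hadoop2.7/bin/spark-submit --jars "$JARS" test.py
```

The final `echo` only prints the command to run; drop the `echo` once the jars are in place. Unlike `--packages`, `--jars` does not resolve transitive dependencies for you, which is why kafka-clients must be downloaded explicitly.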
