Spark job failing with: java.net.SocketException: Bad address (ioctl(SIOCGIFCONF) failed)

cgh8pdjw · posted 2021-05-27 in Spark

Originally I was running jobs locally on my Linux server with spark-submit, and they launched fine. It seems I started getting this error after I tested

ssh -D 8080 -C -N user@server

from my laptop to this server. Now, when I execute a very simple piece of Python code like the following:

from pyspark import SparkContext
# Local mode with two worker threads, application name "test"
sc = SparkContext("local[2]", "test")

I get two error messages. The first comes from the JVM:

Exception in thread "main" java.lang.ExceptionInInitializerError
    at org.apache.spark.SparkConf$.<init>(SparkConf.scala:668)
    at org.apache.spark.SparkConf$.<clinit>(SparkConf.scala)
    at org.apache.spark.SparkConf$$anonfun$getOption$1.apply(SparkConf.scala:375)
    at org.apache.spark.SparkConf$$anonfun$getOption$1.apply(SparkConf.scala:375)
    at scala.Option.orElse(Option.scala:289)
    at org.apache.spark.SparkConf.getOption(SparkConf.scala:375)
    at org.apache.spark.SparkConf.get(SparkConf.scala:250)
    at org.apache.spark.deploy.SparkHadoopUtil$.org$apache$spark$deploy$SparkHadoopUtil$$appendS3AndSparkHadoopConfigurations(SparkHadoopUtil.scala:473)
    at org.apache.spark.deploy.SparkHadoopUtil$.newConfiguration(SparkHadoopUtil.scala:446)
    at org.apache.spark.deploy.SparkSubmit$$anonfun$1.apply(SparkSubmit.scala:383)
    at org.apache.spark.deploy.SparkSubmit$$anonfun$1.apply(SparkSubmit.scala:383)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.deploy.SparkSubmit$.doPrepareSubmitEnvironment(SparkSubmit.scala:383)
    at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:250)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:171)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.net.SocketException: Bad address (ioctl(SIOCGIFCONF) failed)
    at java.net.NetworkInterface.getAll(Native Method)
    at java.net.NetworkInterface.getNetworkInterfaces(NetworkInterface.java:355)
    at org.apache.spark.util.Utils$.findLocalInetAddress(Utils.scala:922)
    at org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$localIpAddress$lzycompute(Utils.scala:908)
    at org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$localIpAddress(Utils.scala:908)
    at org.apache.spark.util.Utils$$anonfun$localCanonicalHostName$1.apply(Utils.scala:965)
    at org.apache.spark.util.Utils$$anonfun$localCanonicalHostName$1.apply(Utils.scala:965)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.util.Utils$.localCanonicalHostName(Utils.scala:965)
    at org.apache.spark.internal.config.package$.<init>(package.scala:282)
    at org.apache.spark.internal.config.package$.<clinit>(package.scala)
    ... 17 more
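The root cause here is NetworkInterface.getAll failing at the native SIOCGIFCONF ioctl while Spark tries to discover the local IP address (Utils.findLocalInetAddress). As a rough OS-level sanity check (a sketch only; Python's socket module does not go through the same JVM ioctl path), interface enumeration can be tested directly:

import socket

# List the machine's network interfaces via if_nameindex(3).
# If this also fails, the problem is at the OS/namespace level
# rather than something Spark-specific.
for index, name in socket.if_nameindex():
    print(index, name)

# Resolve the local hostname, which Spark also attempts while
# computing the driver's bind address.
print(socket.gethostname(), socket.gethostbyname(socket.gethostname()))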

The second comes from Python, since I launch my Spark jobs through Python:

Traceback (most recent call last):
  File "***.py", line 19, in <module>
    sc = SparkContext("local[2]", "test")
  File "/usr/local/lib/python3.6/site-packages/pyspark/context.py", line 133, in __init__
    SparkContext._ensure_initialized(self, gateway=gateway, conf=conf)
  File "/usr/local/lib/python3.6/site-packages/pyspark/context.py", line 316, in _ensure_initialized
    SparkContext._gateway = gateway or launch_gateway(conf)
  File "/usr/local/lib/python3.6/site-packages/pyspark/java_gateway.py", line 46, in launch_gateway
    return _launch_gateway(conf)
  File "/usr/local/lib/python3.6/site-packages/pyspark/java_gateway.py", line 108, in _launch_gateway
    raise Exception("Java gateway process exited before sending its port number")
Exception: Java gateway process exited before sending its port number

I don't really understand what I changed. The Python error looks like a downstream symptom: the JVM crashes while initializing SparkConf, so the gateway process exits before it can send its port number back to my Python code. If the local-IP discovery is really what broke, pinning the address explicitly might work around it (see the sketch below). Thanks in advance for any insight. :)
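For reference, the workaround I have in mind (an untested sketch, assuming Spark's Utils.findLocalInetAddress honors the documented SPARK_LOCAL_IP environment variable before it enumerates interfaces):

import os

# Pin Spark's local address so the JVM never has to enumerate
# network interfaces. Must be set before the gateway is launched.
os.environ["SPARK_LOCAL_IP"] = "127.0.0.1"

from pyspark import SparkContext
sc = SparkContext("local[2]", "test")

The same effect should be achievable by exporting SPARK_LOCAL_IP in the shell, or in conf/spark-env.sh, before calling spark-submit.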

