Spark Python API Java gateway socket connection error

xriantvc · published 2021-06-21 in Mesos

My cluster is Spark-0.7.2 + Mesos-0.9. I wrote a Spark program in Python that runs fine in local mode, but when I run it on Mesos I get errors. Here is the error message:

13/09/30 15:40:13 INFO TaskSetManager: Finished TID 13 in 242 ms (progress: 2/3)
13/09/30 15:40:13 INFO DAGScheduler: Completed ResultTask(4, 1)
send
Exception in thread "DAGScheduler" spark.SparkException: EOF reached before Python server acknowledged
        at spark.api.python.PythonAccumulatorParam.addInPlace(PythonRDD.scala:303)
        at spark.api.python.PythonAccumulatorParam.addInPlace(PythonRDD.scala:278)
        at spark.Accumulable.$plus$plus$eq(Accumulators.scala:52)
        at spark.Accumulators$$anonfun$add$2.apply(Accumulators.scala:235)
        at spark.Accumulators$$anonfun$add$2.apply(Accumulators.scala:233)
        at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:93)
        at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:93)
        at scala.collection.Iterator$class.foreach(Iterator.scala:660)
        at scala.collection.mutable.HashTable$$anon$1.foreach(HashTable.scala:157)
        at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:190)
        at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:43)
        at scala.collection.mutable.HashMap.foreach(HashMap.scala:93)
        at spark.Accumulators$.add(Accumulators.scala:233)
        at spark.scheduler.DAGScheduler.handleTaskCompletion(DAGScheduler.scala:494)
        at spark.scheduler.DAGScheduler.processEvent(DAGScheduler.scala:300)
        at spark.scheduler.DAGScheduler.spark$scheduler$DAGScheduler$$run(DAGScheduler.scala:364)
        at spark.scheduler.DAGScheduler$$anon$1.run(DAGScheduler.scala:107)
13/09/30 15:40:13 INFO TaskSetManager: Finished TID 12 in 407 ms (progress: 3/3)

It does not happen every time, so the socket connection seems to be unstable. Can anyone suggest how to fix this?
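For context on what the stack trace is doing: in `PythonAccumulatorParam.addInPlace`, the JVM pushes accumulator updates over a socket to a small Python-side server and then waits for an acknowledgement byte; the exception fires when the connection is closed before that byte arrives. Below is a minimal stand-alone sketch of that request/acknowledge pattern (this is not Spark's actual wire protocol, and all names here are hypothetical) showing how a dropped connection produces exactly an "EOF before acknowledged" failure:

```python
import socket
import threading

def flaky_ack_server(srv, send_ack):
    """Accept one connection and read a payload, then either send a
    1-byte acknowledgement or close without one (simulating the
    unstable connection described in the question)."""
    conn, _ = srv.accept()
    with conn:
        conn.recv(1024)            # consume the client's update
        if send_ack:
            conn.sendall(b"\x01")  # acknowledge receipt
        # else: closing without writing causes EOF on the client side

def send_update(port):
    """Send an update and block until the server's ack byte arrives.
    An empty read means the peer closed first, mirroring
    'EOF reached before Python server acknowledged'."""
    with socket.create_connection(("127.0.0.1", port)) as c:
        c.sendall(b"partial-sums")
        ack = c.recv(1)
        if not ack:
            raise EOFError("EOF reached before server acknowledged")
        return ack

def run_once(send_ack):
    """Run one client/server exchange on an ephemeral local port."""
    srv = socket.socket()
    srv.bind(("127.0.0.1", 0))
    srv.listen(1)
    port = srv.getsockname()[1]
    t = threading.Thread(target=flaky_ack_server, args=(srv, send_ack))
    t.start()
    try:
        return send_update(port)
    finally:
        t.join()
        srv.close()
```

With `run_once(True)` the exchange succeeds and returns the ack byte; with `run_once(False)` the client raises `EOFError`, which is the same failure mode the trace reports when the socket drops intermittently.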

gojuced7 · 1#

I solved this problem by updating Java 8 to update 91 (8u91).
