(Windows 10) Exception: Java gateway process exited before sending its port number

ifsvaxew · posted 2021-05-27 in Spark

I understand this is a frequently asked question, but I have already set up my environment and everything else (screenshots attached). I'm on Windows 10 and need PySpark for a course module. Please help...
The error appears after running the following:

    import pyspark
    from pyspark.sql import SparkSession
    spark = SparkSession.builder.config("spark-master","local").getOrCreate()
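As an aside, "spark-master" is not a recognized Spark configuration key: the standard property is spark.master, and the builder also exposes a .master(...) method. The key name alone would not normally produce the gateway error below (that points at the JVM never starting), but a minimal sketch of the conventional form would look like this (the app name is just an illustrative placeholder):

    from pyspark.sql import SparkSession

    # Conventional local setup; "local[*]" uses all available cores.
    spark = (
        SparkSession.builder
        .master("local[*]")          # equivalent to .config("spark.master", "local[*]")
        .appName("gateway-check")    # placeholder name, not from the original post
        .getOrCreate()
    )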

Here is the traceback:

    Exception                                 Traceback (most recent call last)
    <ipython-input-3-12e393ff7f76> in <module>
          1 import pyspark
          2 from pyspark.sql import SparkSession
    ----> 3 spark = SparkSession.builder.config("spark-master","local").getOrCreate()

    C:\opt\spark\spark-2.4.6-bin-hadoop2.7\python\pyspark\sql\session.py in getOrCreate(self)
        171             for key, value in self._options.items():
        172                 sparkConf.set(key, value)
    --> 173             sc = SparkContext.getOrCreate(sparkConf)
        174             # This SparkContext may be an existing one.
        175             for key, value in self._options.items():

    C:\opt\spark\spark-2.4.6-bin-hadoop2.7\python\pyspark\context.py in __init__(self, master, appName, sparkHome, pyFiles, environment, batchSize, serializer, conf, gateway, jsc, profiler_cls)
        131             " note this option will be removed in Spark 3.0")
        132
    --> 133         SparkContext._ensure_initialized(self, gateway=gateway, conf=conf)
        134         try:
        135             self._do_init(master, appName, sparkHome, pyFiles, environment, batchSize, serializer,

    C:\opt\spark\spark-2.4.6-bin-hadoop2.7\python\pyspark\context.py in getOrCreate(cls, conf)
        365         with SparkContext._lock:
        366             if SparkContext._active_spark_context is None:
    --> 367                 SparkContext(conf=conf or SparkConf())
        368             return SparkContext._active_spark_context
        369

    C:\opt\spark\spark-2.4.6-bin-hadoop2.7\python\pyspark\context.py in _ensure_initialized(cls, instance, gateway, conf)
        314         with SparkContext._lock:
        315             if not SparkContext._gateway:
    --> 316                 SparkContext._gateway = gateway or launch_gateway(conf)
        317                 SparkContext._jvm = SparkContext._gateway.jvm
        318

    C:\opt\spark\spark-2.4.6-bin-hadoop2.7\python\pyspark\java_gateway.py in launch_gateway(conf)
         44     :return: a JVM gateway
         45     """
    ---> 46     return _launch_gateway(conf)
         47
         48

    C:\opt\spark\spark-2.4.6-bin-hadoop2.7\python\pyspark\java_gateway.py in _launch_gateway(conf, insecure)
        106
        107     if not os.path.isfile(conn_info_file):
    --> 108         raise Exception("Java gateway process exited before sending its port number")
        109
        110     with open(conn_info_file, "rb") as info:

    Exception: Java gateway process exited before sending its port number
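For context, launch_gateway starts a spark-submit child process and waits for it to write a connection-info file; the exception at the bottom is raised because that file never appeared, i.e. the JVM exited before it could report its port. On Windows this is most often an environment problem: Java not installed or not on PATH, JAVA_HOME unset or pointing at a problematic path, or SPARK_HOME not visible to the Python process; and since Spark 2.4.x targets Java 8, a newer JDK is another common culprit. A small diagnostic sketch under those assumptions (the commented path is a placeholder, not a known value):

    import os, shutil

    # What this Python process (and therefore launch_gateway) actually sees.
    print("JAVA_HOME  =", os.environ.get("JAVA_HOME"))
    print("SPARK_HOME =", os.environ.get("SPARK_HOME"))
    print("java on PATH:", shutil.which("java"))

    # If JAVA_HOME is missing, it can be set for the current session before the
    # SparkSession is created (placeholder path -- adjust to the local JDK install):
    # os.environ["JAVA_HOME"] = r"C:\Program Files\Java\jdk1.8.0_xx"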


No answers yet.
