Dear users and coders,
PySpark 3.0.1 and Python 3.6.
Running a simple standalone example fails with this error:

Traceback (most recent call last):
  File "C:\Users\bru\eclipse-workspace\SPARK\test.py", line 8, in <module>
    sc = SparkContext(conf=conf)
  File "C:\Users\bru\AppData\Local\Programs\Python\Python36\lib\site-packages\pyspark\context.py", line 133, in __init__
    SparkContext._ensure_initialized(self, gateway=gateway, conf=conf)
  File "C:\Users\bru\AppData\Local\Programs\Python\Python36\lib\site-packages\pyspark\context.py", line 325, in _ensure_initialized
    SparkContext._gateway = gateway or launch_gateway(conf)
  File "C:\Users\bru\AppData\Local\Programs\Python\Python36\lib\site-packages\pyspark\java_gateway.py", line 105, in launch_gateway
    raise Exception("Java gateway process exited before sending its port number")
Exception: Java gateway process exited before sending its port number
The code is as follows:
from pyspark import SparkConf
from pyspark import SparkContext
conf = SparkConf()
conf.setMaster('local')
conf.setAppName('spark-basic')
sc = SparkContext(conf=conf)
def mod(x):
    import numpy as np
    return (x, np.mod(x, 2))
rdd = sc.parallelize(range(1000)).map(mod).take(10)
print(rdd)
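As a side note, the mapping itself can be checked without Spark at all: on plain integers, np.mod(x, 2) is equivalent to Python's x % 2, so the expected first ten pairs can be computed directly. A minimal pure-Python sketch:

```python
# Pure-Python check of the mod() logic, without Spark or NumPy
# (np.mod(x, 2) on an int is equivalent to x % 2).
def mod(x):
    return (x, x % 2)

result = [mod(x) for x in range(10)]
print(result)
# → [(0, 0), (1, 1), (2, 0), (3, 1), (4, 0), (5, 1), (6, 0), (7, 1), (8, 0), (9, 1)]
```

This confirms the failure is in launching the Java gateway, not in the mapping function.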
JAVA_HOME=c:\progra~1\java\jdk1.8.0_201
SPARK_HOME=c:\spark-3.0.1-bin-hadoop2.7
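The "Java gateway process exited before sending its port number" error usually means PySpark could not launch the JVM, which often comes down to these variables not being visible to the Python process itself. A minimal sketch of one common check (the paths below are the ones from this machine; this is an assumption, not a confirmed fix):

```python
import os

# Hypothetical troubleshooting sketch: set the variables inside the
# Python process before SparkContext is created, in case they are not
# inherited from the shell environment.
os.environ.setdefault("JAVA_HOME", r"c:\progra~1\java\jdk1.8.0_201")
os.environ.setdefault("SPARK_HOME", r"c:\spark-3.0.1-bin-hadoop2.7")

# Verify that both are now visible to this process.
print(os.environ["JAVA_HOME"])
print(os.environ["SPARK_HOME"])
```

If the variables are only set in the Windows GUI, a restart of Eclipse (or the console) may also be needed for the new environment to be picked up.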
Could you help me out (I'm a newbie)? Thank you very much!