Shadowed module cannot be accessed in PySpark

g6ll5ycj · published 2021-05-27 in Spark

I am new to PySpark. I was trying to run some PySpark code. I created a script named "time.py", and since then PySpark fails to start. I get the error below.

Traceback (most recent call last):
  File "/home/VAL_CODE/test.py", line 1, in <module>
    from pyspark import SparkContext,HiveContext,SparkConf
  File "/opt/cloudera/parcels/CDH-6.3.3-1.cdh6.3.3.p0.1796617/lib/spark/python/lib/pyspark.zip/pyspark/__init__.py", line 51, in <module>
  File "/opt/cloudera/parcels/CDH-6.3.3-1.cdh6.3.3.p0.1796617/lib/spark/python/lib/pyspark.zip/pyspark/context.py", line 24, in <module>
  File "/usr/lib64/python2.7/threading.py", line 14, in <module>
    from time import time as _time, sleep as _sleep
  File "/home/VAL_CODE/time.py", line 1, in <module>
ImportError: cannot import name SparkContext
20/08/13 19:04:16 INFO util.ShutdownHookManager: Shutdown hook called
20/08/13 19:04:16 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-f104da1f-ba70-4c45-8a19-6ffc55b609aa

The problem is that the standard library script "/usr/lib64/python2.7/threading.py" is picking up the local script "/home/VAL_CODE/time.py" that I created instead of the built-in time module. I have since deleted "/home/VAL_CODE/time.py", but I still hit the same error when I run a new script, "/home/VAL_CODE/test.py". Please help me resolve this.
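For context, a quick way to confirm this kind of module shadowing is to check where Python resolves the `time` module from. This is a minimal diagnostic sketch (not part of the original post); note that in CPython `time` is a built-in C module, so it normally has no `__file__` attribute at all:

```python
import importlib

# Import "time" the same way threading.py does and see where it came from.
mod = importlib.import_module("time")
origin = getattr(mod, "__file__", None)

if origin is None:
    # The interpreter built-in won the lookup: no shadowing.
    print("time resolves to the interpreter built-in: OK")
else:
    # A file on sys.path (e.g. a local time.py, or a leftover compiled
    # time.pyc / __pycache__ entry) is shadowing the standard module.
    # Deleting time.py alone is not enough if the cached bytecode remains.
    print("time is shadowed by: " + origin)
```

One likely cause of the error persisting after deleting `time.py` is that the compiled bytecode file (`time.pyc` on Python 2, or an entry under `__pycache__/` on Python 3) is still present in the working directory and continues to shadow the standard library module.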

No answers yet.
