I have the following project structure:

    work_directory/
        merge.py
        a_package/

i.e. a Python file merge.py and a directory a_package, both under the working directory.
In merge.py I wrote a MapReduce job using mrjob, and I need to import a_package in it, like from a_package import something. But I am having trouble getting a_package uploaded into Hadoop.
I tried this approach (https://mrjob.readthedocs.io/en/latest/guides/writing-mrjobs.html#using-other-python-modules-and-packages): I wrote
    from mrjob.job import MRJob

    class MRPackageUsingJob(MRJob):
        DIRS = ['a_package']

and imported the code from inside the mapper:

        def mapper(self, key, value):
            from a_package import something
I also tried this: https://mrjob.readthedocs.io/en/latest/guides/setup-cookbook.html#uploading-your-source-tree
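That recipe amounts to tarring up the source tree and adding it to PYTHONPATH through a --setup command, roughly along these lines (a sketch of what that looks like; the archive name a_package.tar.gz and the input path are just placeholders):

    # build an archive with a_package/ at its root
    tar -czf a_package.tar.gz a_package

    # mrjob uploads and unpacks the archive on each node; the #/ suffix
    # expands to the unpacked directory, which gets added to PYTHONPATH
    python merge.py -r hadoop \
        --setup 'export PYTHONPATH=$PYTHONPATH:a_package.tar.gz#/' \
        input.txt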
But neither of them works; the job keeps failing with ImportError: No module named a_package.

What should I do?
1 Answer
You just need to create an empty file __init__.py inside the a_package folder. For example:
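(a sketch of the resulting layout; something.py just stands in for whatever module you actually import)

    work_directory/
        merge.py
        a_package/
            __init__.py      <- empty file that marks a_package as a Python package
            something.py

With __init__.py in place, DIRS = ['a_package'] will ship the whole package alongside the job, and the from a_package import something inside the mapper should then resolve.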