Error copying a file from s3:// to the local (Hadoop) file system

Asked by kxkpmulp on 2021-06-03 in Hadoop

I'm trying to copy a file from S3 to the Hadoop file system using Python. I get the following error:

cp: `foo/ds=2015-02-13/ip-d1b-request-2015-02-13_10-00_10-09.txt.gz': No such file or directory

I recently migrated to the latest Hadoop version (2.4.0). It ran fine on version 0.20. Why does this error occur on 2.4.0?

On Hadoop version 0.20:

hadoop@ip-10-76-38-167:~$ /home/hadoop/bin/hadoop fs -cp s3://test.com/foo/ds=2015-02-13/ip-d1b-request-2015-02-13_10-00_10-09.txt.gz /foo/ds=2015-02-13/ip-d1b-request-2015-02-13_10-00_10-09.txt.gz

15/02/13 11:21:45 INFO s3native.NativeS3FileSystem: Opening 's3://test.com/foo/ds=2015-02-13/ip-d1b-request-2015-02-13_10-00_10-09.txt.gz' for reading

On Hadoop version 2.4.0:

[hadoop@ip-10-169-19-123 ~]$ /home/hadoop/bin/hadoop fs -cp s3://test.com/foo/ds=2015-02-13/ip-d1b-request-2015-02-13_10-00_10-09.txt.gz /foo/ds=2015-02-13/ip-d1b-request-2015-02-13_10-00_10-09.txt.gz

15/02/13 11:21:37 INFO guice.EmrFSBaseModule: Consistency disabled, using com.amazon.ws.emr.hadoop.fs.s3n.S3NativeFileSystem as FileSystem implementation.

15/02/13 11:21:38 INFO fs.EmrFileSystem: Using com.amazon.ws.emr.hadoop.fs.s3n.S3NativeFileSystem as filesystem implementation

cp: `foo/ds=2015-02-13/ip-d1b-request-2015-02-13_10-00_10-09.txt.gz': No such file or directory
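One thing worth verifying before the copy (my assumption, not something the question confirms): `hadoop fs -cp` does not create missing parent directories, so the destination directory has to exist first, e.g.:

/home/hadoop/bin/hadoop fs -mkdir -p /foo/ds=2015-02-13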

Answer #1 (cunj1qz1):

I found the answer myself.

Use `distcp` instead of `fs -cp`.

That command works without any issues.
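For reference, a sketch of the equivalent `distcp` invocation, reusing the same bucket and paths from the question (exact behavior and flags can vary across Hadoop/EMR versions):

/home/hadoop/bin/hadoop distcp s3://test.com/foo/ds=2015-02-13/ip-d1b-request-2015-02-13_10-00_10-09.txt.gz /foo/ds=2015-02-13/ip-d1b-request-2015-02-13_10-00_10-09.txt.gz

Unlike `fs -cp`, `distcp` runs the copy as a MapReduce job, which also makes it a better fit for copying large or many objects out of S3.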


Answer #2 (wr98u20j):

You have to try it like this, adding double quotes ("") around each path:

/home/hadoop/bin/hadoop fs -cp "s3://test.com/foo/ds=2015-02-13/ip-d1b-request-2015-02-13_10-00_10-09.txt.gz" "/foo/ds=2015-02-13/ip-d1b-request-2015-02-13_10-00_10-09.txt.gz"
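Presumably the quotes keep the shell from splitting or reinterpreting the arguments; quoting each path is a safe habit whenever S3 keys contain characters such as `=`, as the partition-style `ds=2015-02-13` segment does here.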
