Pig on Amazon EMR - error when storing output to a file

ar5n3qh5 · posted 2021-06-02 in Hadoop

I am currently using Pig on Amazon Elastic MapReduce and trying to do a simple task: load some data from S3, transform it, and store the result to a file. However, I run into problems when I use the STORE command.
Here is my code:

-- Copy the input CSV from S3 to the local filesystem on the master node
cp s3://stackexchangedata/Data/Query_1-50000.csv file:///home/hadoop

-- Register piggybank so that CSVExcelStorage is available
REGISTER 'file:///home/hadoop/piggybank.jar'

-- Load the CSV with an explicit schema
RAW_LOGS1 = LOAD 'file:///home/hadoop/Query_1-50000.csv' USING org.apache.pig.piggybank.storage.CSVExcelStorage(',', 'YES_MULTILINE') as (Id:Long, PostTypeID:chararray, AcceptedAnswerID:chararray, ParentID:chararray, CreationDate:chararray, DeletionDate:chararray,  Score:long, ViewCount:long, Body:chararray, OwnerUserID:chararray, OwnerDisplayName:chararray, LastEditorUserId:chararray, LastEditorDisplayName:chararray, LastEditDate:chararray, LastActivityDate:chararray, Title:chararray, Tags:chararray, AnswerCount:int, CommentCount:int, FavoriteCount:int, ClosedDate:chararray, CommunityOwnedDate:chararray);

-- Strip embedded newlines from the Body field
RAW_LOGS1A = FOREACH RAW_LOGS1 GENERATE $0, $1, $2, $3, $4, $5, $6, $7, REPLACE(Body, '\n','') AS Body_Clean, $9, $10, $11, $12, $13, $14, $15, $16, $17, $18, $19, $20, $21;

-- Write the cleaned records to a local directory
STORE RAW_LOGS1A INTO 'file:///home/hadoop/test/';

The REGISTER, LOAD, and GENERATE commands seem to work, but the STORE command does not. There is a lot of output, so I have only copied the warning and error parts below.

16/03/20 21:03:22 WARN mapreduce.JobResourceUploader: No job jar file set.  User classes may not be found. See Job or Job#setJar(String).

org.apache.pig.backend.executionengine.ExecException: ERROR 2118: Wrong FS: hdfs://ip-172-31-21-40.eu-west-1.compute.internal:8020/user/hadoop, expected: file:///
   at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.getSplits(PigInputFormat.java:279)
    at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:301)
    at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:318)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:196)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287)
    at org.apache.hadoop.mapreduce.lib.jobcontrol.ControlledJob.submit(ControlledJob.java:335)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.pig.backend.hadoop23.PigJobControl.submit(PigJobControl.java:128)
    at org.apache.pig.backend.hadoop23.PigJobControl.run(PigJobControl.java:194)
    at java.lang.Thread.run(Thread.java:745)
    at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher$1.run(MapReduceLauncher.java:276)
Caused by: java.lang.IllegalArgumentException: Wrong FS: hdfs://ip-172-31-21-40.eu-west-1.compute.internal:8020/user/hadoop, expected: file:///
    at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:650)
    at org.apache.hadoop.fs.RawLocalFileSystem.setWorkingDirectory(RawLocalFileSystem.java:547)
    at org.apache.hadoop.fs.FilterFileSystem.setWorkingDirectory(FilterFileSystem.java:290)
    at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.getSplits(PigInputFormat.java:235)
    ... 18 more
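
If I am reading the error correctly, Pig resolves its working directory on HDFS while my STORE path uses the local file:// scheme, and the two filesystems do not match. Would something along these lines be the right approach instead - storing to HDFS (or back to S3) and then copying the result to local disk from the Grunt shell? The output paths below are just placeholders I made up, not something I have tested:

-- Untested sketch: write the output to the cluster's default filesystem
-- (HDFS on EMR) instead of file://, then pull the result down to the
-- master node's local disk. '/user/hadoop/test' and '/home/hadoop/test'
-- are placeholder paths.
STORE RAW_LOGS1A INTO '/user/hadoop/test';
fs -copyToLocal /user/hadoop/test /home/hadoop/test

-- Or store straight back to S3 (bucket/prefix is a placeholder):
-- STORE RAW_LOGS1A INTO 's3://stackexchangedata/output/';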

Can anyone help? I am very new to Hadoop, so any help is much appreciated.
Thanks in advance.
