The problem is:
I have a Spark application that cannot write data to S3. Reading works fine.
The Spark configuration:
SparkConf conf = new SparkConf();
...
conf.set("spark.hadoop.fs.s3a.endpoint", getCredentialConfig().getS3Endpoint());
System.setProperty("com.amazonaws.services.s3.enableV4", "true");// local works. enable aws v4 auth.
conf.set("spark.hadoop.fs.s3a.impl", org.apache.hadoop.fs.s3a.S3AFileSystem.class.getName());
conf.set("spark.hadoop.fs.s3a.access.key", getCredentialConfig().getS3Key());
conf.set("spark.hadoop.fs.s3a.secret.key", getCredentialConfig().getS3Secret());
conf.set("spark.hadoop.fs.hdfs.impl", org.apache.hadoop.hdfs.DistributedFileSystem.class.getName());
conf.set("spark.hadoop.fs.file.impl", org.apache.hadoop.fs.LocalFileSystem.class.getName());
...
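Note that System.setProperty("com.amazonaws.services.s3.enableV4", "true") only takes effect in the JVM where it is called (the driver). A minimal sketch of also passing the V4 flag to executor JVMs, assuming executors run as separate processes (the helper name withV4Signing is hypothetical):

import org.apache.spark.SparkConf;

public class S3AV4Config {
    // Hypothetical helper: propagate the AWS SDK V4-signing flag to executor JVMs.
    public static SparkConf withV4Signing(SparkConf conf) {
        // Executors start in their own JVMs, so the system property must be passed there as a JVM option.
        conf.set("spark.executor.extraJavaOptions", "-Dcom.amazonaws.services.s3.enableV4=true");
        // In client mode the driver JVM is already running, so for the driver the flag
        // has to go on the command line instead, e.g.:
        //   spark-submit --driver-java-options "-Dcom.amazonaws.services.s3.enableV4=true" ...
        return conf;
    }
}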
The write code is:
String fileName = "s3a://" + getCredentialConfig().getS3Bucket() + "/s3-outputs/test/";
getSparkSession()
.createDataset(list, Encoders.INT())
.write()
.format("com.databricks.spark.csv")
.mode("overwrite")
.csv(fileName);
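For comparison, reading from the same bucket with this configuration works fine. A minimal read-back sketch (the object key s3-outputs/existing-input.csv is hypothetical, used only for illustration):

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;

// Reading through s3a:// with the same credentials and endpoint succeeds.
Dataset<Row> existing = getSparkSession()
        .read()
        .csv("s3a://" + getCredentialConfig().getS3Bucket() + "/s3-outputs/existing-input.csv");
existing.show(10);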
The exception is:
10:35:01.914 [main] DEBUG org.apache.hadoop.fs.s3a.S3AFileSystem - Not Found: s3a://mybucket/s3-outputs/test/_temporary-39c4ebc3-61bd-47e0-9ac6-d047af1965f3
10:35:01.914 [main] DEBUG org.apache.hadoop.fs.s3a.S3AFileSystem - Couldn't delete s3a://mybucket/s3-outputs/test/_temporary-39c4ebc3-61bd-47e0-9ac6-d047af1965f3 - does not exist
This means Spark cannot find the temporary folder on the target file system.
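To confirm that the target prefix is reachable outside of Spark, a small diagnostic sketch using the plain Hadoop FileSystem API (assumes the same endpoint and credentials as in the SparkConf above; mybucket matches the bucket name in the log):

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Enable V4 signing in this JVM as well.
System.setProperty("com.amazonaws.services.s3.enableV4", "true");
Configuration hadoopConf = new Configuration();
hadoopConf.set("fs.s3a.endpoint", getCredentialConfig().getS3Endpoint());
hadoopConf.set("fs.s3a.access.key", getCredentialConfig().getS3Key());
hadoopConf.set("fs.s3a.secret.key", getCredentialConfig().getS3Secret());
// Check whether the output prefix exists as seen by the S3A file system.
FileSystem fs = FileSystem.get(URI.create("s3a://mybucket/"), hadoopConf);
System.out.println(fs.exists(new Path("s3a://mybucket/s3-outputs/test/")));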
Current Hadoop version: 2.7.3
Java 8
On Hadoop 2.8.1 everything works fine, but AWS EMR does not currently support Hadoop 2.8.* versions.