Spark shell error: value builder is not a member of object com.amazonaws.services.s3.model.PutObjectRequest

oxcyiej7  posted 2021-05-27 in Spark

I have just started working with EMR Hadoop/Spark and I am trying to run Scala code in spark-shell to upload a file to an EMRFS S3 location, but I am getting the following error.
If I run:

val bucketName = "bucket"
val outputPath = "test.txt"

scala> val putRequest = PutObjectRequest.builder.bucket(bucketName).key(outputPath).build()
<console>:27: error: not found: value PutObjectRequest
   val putRequest = PutObjectRequest.builder.bucket(bucketName).key(outputPath).build()
                    ^

After adding the import for PutObjectRequest, I still get a different error:

scala> import com.amazonaws.services.s3.model.PutObjectRequest

import com.amazonaws.services.s3.model.PutObjectRequest

scala> val putRequest = PutObjectRequest.builder.bucket(bucketName).key(outputPath).build()
<console>:28: error: value builder is not a member of object com.amazonaws.services.s3.model.PutObjectRequest
   val putRequest = PutObjectRequest.builder.bucket(bucketName).key(outputPath).build()
                                     ^

I am not sure what I am missing. Any help would be appreciated!
Note: the Spark version is 2.4.5.

uidvcgyl1#

The `com.amazonaws.services.s3` packages belong to AWS SDK for Java v1, whose PutObjectRequest has no `builder` method (that fluent builder style is part of SDK v2). With the v1 SDK, create the PutObjectRequest through one of its constructors instead, and use AmazonS3ClientBuilder to create the connection to S3.

import com.amazonaws.regions.Regions
import com.amazonaws.services.s3.AmazonS3ClientBuilder
import com.amazonaws.services.s3.model.ObjectMetadata
import com.amazonaws.services.s3.model.PutObjectRequest

import java.io.File

val clientRegion = Regions.DEFAULT_REGION
val bucketName = "***Bucket name***"
val fileObjKeyName = "***File object key name***"
val fileName = "***Path to file to upload***"

val s3Client = AmazonS3ClientBuilder.standard.withRegion(clientRegion).build

// Upload a file as a new object with ContentType and title specified.
val request = new PutObjectRequest(bucketName, fileObjKeyName, new File(fileName))
val metadata = new ObjectMetadata()
metadata.setContentType("text/plain")
metadata.addUserMetadata("title", "someTitle")
request.setMetadata(metadata)
s3Client.putObject(request)
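
Alternatively, the `builder` style from the question does exist, but only in AWS SDK for Java v2 (`software.amazon.awssdk`), which is a different set of artifacts from the v1 `com.amazonaws` packages. If the v2 SDK happens to be on the classpath, a minimal sketch of the same upload would look like this (the bucket name, key, and file path are placeholders):

```scala
import java.nio.file.Paths

import software.amazon.awssdk.regions.Region
import software.amazon.awssdk.services.s3.S3Client
import software.amazon.awssdk.services.s3.model.PutObjectRequest

// In SDK v2, request objects are immutable and constructed via builder().
val putRequest = PutObjectRequest.builder()
  .bucket("bucket")           // placeholder bucket name
  .key("test.txt")            // placeholder object key
  .contentType("text/plain")
  .build()

// The client is also built with a builder; pick the region explicitly.
val s3 = S3Client.builder().region(Region.US_EAST_1).build()

// v2 provides a convenience overload that uploads directly from a Path.
s3.putObject(putRequest, Paths.get("/path/to/test.txt"))
```

Mixing the two SDKs is a common source of this exact error: the import resolves against v1 while the code follows a v2 example, so stick to one SDK generation throughout.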
