Does not have storage.objects.get access permission

x8goxv8g · posted 2021-05-31 in Hadoop

I can't resolve a GCS bucket permission problem when submitting a job to Dataproc.
Here is what I did:
Created a project
Created a bucket xmitya-test
Created a cluster:

gcloud dataproc clusters create cascade --bucket=xmitya-test \
    --master-boot-disk-size=80G --master-boot-disk-type=pd-standard \
    --num-master-local-ssds=0 --num-masters=1 \
    --num-workers=2 --num-worker-local-ssds=0 \
    --worker-boot-disk-size=80G --worker-boot-disk-type=pd-standard \
    --master-machine-type=n1-standard-2 \
    --worker-machine-type=n1-standard-2 \
    --zone=us-west1-a --image-version=1.3 \
    --properties 'hadoop-env:HADOOP_CLASSPATH=${HADOOP_CLASSPATH}:/etc/tez/conf:/usr/lib/tez/*:/usr/lib/tez/lib/*'

Uploaded the job jar /apps/wordcount.jar and the library /apps/lib/commons-collections-3.2.2.jar.
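The upload went along these lines (a sketch, assuming both jars were copied from the local machine into the xmitya-test bucket):

gsutil cp wordcount.jar gs://xmitya-test/apps/wordcount.jar
gsutil cp commons-collections-3.2.2.jar gs://xmitya-test/apps/lib/commons-collections-3.2.2.jar

Then submitted the job with the library jar on the classpath: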

gcloud dataproc jobs submit hadoop --cluster=cascade \
    --jar=gs:/apps/wordcount.jar \
    --jars=gs://apps/lib/commons-collections-3.2.2.jar --bucket=xmitya-test \
    -- gs:/input/url+page.200.txt gs:/output/wc.out local

Then I get an error accessing the library file:

java.io.IOException: Error accessing: bucket: apps, object: lib/commons-collections-3.2.2.jar
    at com.google.cloud.hadoop.repackaged.gcs.com.google.cloud.hadoop.gcsio.GoogleCloudStorageImpl.wrapException(GoogleCloudStorageImpl.java:1957)
    at com.google.cloud.hadoop.repackaged.gcs.com.google.cloud.hadoop.gcsio.GoogleCloudStorageImpl.getObject(GoogleCloudStorageImpl.java:1983)
    at com.google.cloud.hadoop.repackaged.gcs.com.google.cloud.hadoop.gcsio.GoogleCloudStorageImpl.getItemInfo(GoogleCloudStorageImpl.java:1870)
    at com.google.cloud.hadoop.repackaged.gcs.com.google.cloud.hadoop.gcsio.GoogleCloudStorageFileSystem.getFileInfo(GoogleCloudStorageFileSystem.java:1156)
    at com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystemBase.getFileStatus(GoogleHadoopFileSystemBase.java:1058)
    at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:363)
    at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:314)
    at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:2375)
    at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:2344)
    at com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystemBase.copyToLocalFile(GoogleHadoopFileSystemBase.java:1793)
    at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:2320)
    at com.google.cloud.hadoop.services.agent.util.HadoopUtil.download(HadoopUtil.java:70)
    at com.google.cloud.hadoop.services.agent.job.AbstractJobHandler.downloadResources(AbstractJobHandler.java:448)
    at com.google.cloud.hadoop.services.agent.job.AbstractJobHandler$StartDriver.call(AbstractJobHandler.java:579)
    at com.google.cloud.hadoop.services.agent.job.AbstractJobHandler$StartDriver.call(AbstractJobHandler.java:568)
    at com.google.cloud.hadoop.services.repackaged.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
    at com.google.cloud.hadoop.services.repackaged.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
    at com.google.cloud.hadoop.services.repackaged.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: com.google.cloud.hadoop.repackaged.gcs.com.google.api.client.googleapis.json.GoogleJsonResponseException: 403 Forbidden
{
  "code" : 403,
  "errors" : [ {
    "domain" : "global",
    "message" : "714526773712-compute@developer.gserviceaccount.com does not have storage.objects.get access to apps/lib/commons-collections-3.2.2.jar.",
    "reason" : "forbidden"
  } ],
  "message" : "714526773712-compute@developer.gserviceaccount.com does not have storage.objects.get access to apps/lib/commons-collections-3.2.2.jar."
}
    at com.google.cloud.hadoop.repackaged.gcs.com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:150)
    at com.google.cloud.hadoop.repackaged.gcs.com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:113)
    at com.google.cloud.hadoop.repackaged.gcs.com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:40)
    at com.google.cloud.hadoop.repackaged.gcs.com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:401)
    at com.google.cloud.hadoop.repackaged.gcs.com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1097)
    at com.google.cloud.hadoop.repackaged.gcs.com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:499)
    at com.google.cloud.hadoop.repackaged.gcs.com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:432)
    at com.google.cloud.hadoop.repackaged.gcs.com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:549)
    at com.google.cloud.hadoop.repackaged.gcs.com.google.cloud.hadoop.gcsio.GoogleCloudStorageImpl.getObject(GoogleCloudStorageImpl.java:1978)
    ... 23 more

I tried granting read permission in the browser to the 714526773712-compute@developer.gserviceaccount.com user, and setting public permissions on all files:

gsutil defacl ch -u AllUsers:R gs://xmitya-test
gsutil acl ch -d allUsers:R gs://xmitya-test/**

Neither had any effect. What could be the reason? Thanks!


u59ebvdq1#

It is complaining about access to the apps, input, and output buckets specified in the arguments of the job submission command:
gcloud dataproc jobs submit hadoop --cluster=cascade --jar=gs:/apps/wordcount.jar --jars=gs://apps/lib/commons-collections-3.2.2.jar --bucket=xmitya-test -- gs:/input/url+page.200.txt gs:/output/wc.out local
To fix this, either grant access to those buckets, or, if apps, input, and output are actually folders inside the xmitya-test bucket, specify the bucket explicitly in each path: gs://xmitya-test/apps/wordcount.jar.
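A sketch of the corrected submit command, assuming all three paths are folders inside xmitya-test (the malformed gs:/ prefixes are also fixed to gs://):

gcloud dataproc jobs submit hadoop --cluster=cascade \
    --jar=gs://xmitya-test/apps/wordcount.jar \
    --jars=gs://xmitya-test/apps/lib/commons-collections-3.2.2.jar \
    --bucket=xmitya-test \
    -- gs://xmitya-test/input/url+page.200.txt gs://xmitya-test/output/wc.out local

If the jars really do live in separate buckets that you own, you could instead grant the cluster's service account read access to them, for example:

gsutil iam ch serviceAccount:714526773712-compute@developer.gserviceaccount.com:objectViewer gs://apps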
