Presto error 400 when connecting to an S3 bucket

gzszwxb4, posted 2021-06-24 in Hive

Overview
I have deployed a Presto cluster on Kubernetes and am trying to connect it to an S3 bucket exposed through an S3-compatible API.
How to reproduce
I use the Presto operator (from Starburst) and configure the Presto resource with the following properties:

hive:
    additionalProperties: |
      connector.name=hive-hadoop2
      hive.metastore.uri=thrift://hive-metastore-presto-cluster-name.default.svc.cluster.local:9083
      hive.s3.endpoint=https://<s3-endpoint>

I also added the S3 credentials secret, whose data includes aws_access_key_id and aws_secret_access_key (values redacted).
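For reference, the secret was created roughly like this (the secret name is a placeholder, and how the operator mounts or references it is not shown here):

kubectl create secret generic presto-s3-credentials \
  --from-literal=aws_access_key_id=<ACCESS_KEY> \
  --from-literal=aws_secret_access_key=<SECRET_KEY>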
Then I create the schema from the Presto CLI with the following command:

CREATE SCHEMA ban WITH (LOCATION = 's3a://path/to/schema');

Error
This throws an error in the Hive metastore, such as:

Got exception: org.apache.hadoop.fs.s3a.AWSBadRequestException doesBucketExist on presto: com.amazonaws.services.s3.model.AmazonS3Exception: Bad Request (Service: Amazon S3; Status Code: 400; Error Code: 400 Bad Request

Problem identification
I found at this link that "this happens when trying to use any S3 service which only supports the 'V4' signing API, but the client is configured to use the default S3 service endpoint".
The proposed solution is: "the S3A client needs to be given the endpoint to use via the fs.s3a.endpoint property".
However, the fs.s3a.endpoint property does not apply to Presto's Hive connector configuration.
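For context, fs.s3a.endpoint is a Hadoop S3A property, and the stack trace in the edit below shows the failing doesBucketExist call coming from the Hive metastore's own S3AFileSystem, so presumably it would have to go into the metastore's Hadoop configuration (core-site.xml) rather than into the Presto catalog. A minimal sketch of what the linked advice seems to describe (the path-style-access entry is my own guess for S3-compatible stores, not something stated in the link):

<property>
  <name>fs.s3a.endpoint</name>
  <value>https://<s3-endpoint></value>
</property>
<property>
  <name>fs.s3a.path.style.access</name>
  <value>true</value>
</property>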
Has anyone already run into this issue?
Useful details
Kubernetes version: 1.18
Presto operator image: starburstdata/presto-operator:341-e-k8s-0.35
--- EDIT ---
Full stack trace

2020-09-29T12:20:34,120 ERROR [pool-7-thread-2] utils.MetaStoreUtils: Got exception: org.apache.hadoop.fs.s3a.AWSBadRequestException doesBucketExist on presto: com.amazonaws.services.s3.model.AmazonS3Exception: Bad Request (Service: Amazon S3; Status Code: 400; Error Code: 400 Bad Request; Request ID: txc7f0218a02574ba59ec91-005f732692; S3 Extended Request ID: txc7f0218a02574ba59ec91-005f732692), S3 Extended Request ID: txc7f0218a02574ba59ec91-005f732692:400 Bad Request: Bad Request (Service: Amazon S3; Status Code: 400; Error Code: 400 Bad Request; Request ID: txc7f0218a02574ba59ec91-005f732692; S3 Extended Request ID: txc7f0218a02574ba59ec91-005f732692)
org.apache.hadoop.fs.s3a.AWSBadRequestException: doesBucketExist on presto: com.amazonaws.services.s3.model.AmazonS3Exception: Bad Request (Service: Amazon S3; Status Code: 400; Error Code: 400 Bad Request; Request ID: txc7f0218a02574ba59ec91-005f732692; S3 Extended Request ID: txc7f0218a02574ba59ec91-005f732692), S3 Extended Request ID: txc7f0218a02574ba59ec91-005f732692:400 Bad Request: Bad Request (Service: Amazon S3; Status Code: 400; Error Code: 400 Bad Request; Request ID: txc7f0218a02574ba59ec91-005f732692; S3 Extended Request ID: txc7f0218a02574ba59ec91-005f732692)
        at org.apache.hadoop.fs.s3a.S3AUtils.translateException(S3AUtils.java:212) ~[hadoop-aws-3.1.1.3.1.0.0-78.jar:?]
        at org.apache.hadoop.fs.s3a.Invoker.once(Invoker.java:111) ~[hadoop-aws-3.1.1.3.1.0.0-78.jar:?]
        at org.apache.hadoop.fs.s3a.Invoker.lambda$retry$3(Invoker.java:260) ~[hadoop-aws-3.1.1.3.1.0.0-78.jar:?]
        at org.apache.hadoop.fs.s3a.Invoker.retryUntranslated(Invoker.java:317) ~[hadoop-aws-3.1.1.3.1.0.0-78.jar:?]
        at org.apache.hadoop.fs.s3a.Invoker.retry(Invoker.java:256) ~[hadoop-aws-3.1.1.3.1.0.0-78.jar:?]
        at org.apache.hadoop.fs.s3a.Invoker.retry(Invoker.java:231) ~[hadoop-aws-3.1.1.3.1.0.0-78.jar:?]
        at org.apache.hadoop.fs.s3a.S3AFileSystem.verifyBucketExists(S3AFileSystem.java:372) ~[hadoop-aws-3.1.1.3.1.0.0-78.jar:?]
        at org.apache.hadoop.fs.s3a.S3AFileSystem.initialize(S3AFileSystem.java:308) ~[hadoop-aws-3.1.1.3.1.0.0-78.jar:?]
        at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3303) ~[hadoop-common-3.1.1.3.1.0.0-78.jar:?]
        at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:124) ~[hadoop-common-3.1.1.3.1.0.0-78.jar:?]
        at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:3352) ~[hadoop-common-3.1.1.3.1.0.0-78.jar:?]
        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:3320) ~[hadoop-common-3.1.1.3.1.0.0-78.jar:?]
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:479) ~[hadoop-common-3.1.1.3.1.0.0-78.jar:?]
        at org.apache.hadoop.fs.Path.getFileSystem(Path.java:361) ~[hadoop-common-3.1.1.3.1.0.0-78.jar:?]
        at org.apache.hadoop.hive.metastore.Warehouse.getFs(Warehouse.java:115) [hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
        at org.apache.hadoop.hive.metastore.Warehouse.getDnsPath(Warehouse.java:141) [hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
        at org.apache.hadoop.hive.metastore.Warehouse.getDnsPath(Warehouse.java:147) [hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
        at org.apache.hadoop.hive.metastore.Warehouse.determineDatabasePath(Warehouse.java:190) [hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_database_core(HiveMetaStore.java:1265) [hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_database(HiveMetaStore.java:1425) [hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_232]
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_232]
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_232]
        at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_232]
        at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:147) [hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
        at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:108) [hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
        at com.sun.proxy.$Proxy26.create_database(Unknown Source) [?:?]
        at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$create_database.getResult(ThriftHiveMetastore.java:14861) [hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
        at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$create_database.getResult(ThriftHiveMetastore.java:14845) [hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
        at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39) [hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
        at org.apache.hadoop.hive.metastore.TUGIBasedProcessor.process(TUGIBasedProcessor.java:104) [hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
        at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286) [hive-exec-3.1.0.3.1.0.0-78.jar:3.1.0.3.1.0.0-78]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_232]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_232]
        at java.lang.Thread.run(Thread.java:748) [?:1.8.0_232]
Caused by: com.amazonaws.services.s3.model.AmazonS3Exception: Bad Request (Service: Amazon S3; Status Code: 400; Error Code: 400 Bad Request; Request ID: txc7f0218a02574ba59ec91-005f732692; S3 Extended Request ID: txc7f0218a02574ba59ec91-005f732692)
        at com.amazonaws.http.AmazonHttpClient$RequestExecutor.handleErrorResponse(AmazonHttpClient.java:1639) ~[aws-java-sdk-bundle-1.11.271.jar:?]
        at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeOneRequest(AmazonHttpClient.java:1304) ~[aws-java-sdk-bundle-1.11.271.jar:?]
        at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeHelper(AmazonHttpClient.java:1056) ~[aws-java-sdk-bundle-1.11.271.jar:?]
        at com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:743) ~[aws-java-sdk-bundle-1.11.271.jar:?]
        at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:717) ~[aws-java-sdk-bundle-1.11.271.jar:?]
        at com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:699) ~[aws-java-sdk-bundle-1.11.271.jar:?]
        at com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:667) ~[aws-java-sdk-bundle-1.11.271.jar:?]
        at com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:649) ~[aws-java-sdk-bundle-1.11.271.jar:?]
        at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:513) ~[aws-java-sdk-bundle-1.11.271.jar:?]
        at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4325) ~[aws-java-sdk-bundle-1.11.271.jar:?]
        at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4272) ~[aws-java-sdk-bundle-1.11.271.jar:?]
        at com.amazonaws.services.s3.AmazonS3Client.headBucket(AmazonS3Client.java:1337) ~[aws-java-sdk-bundle-1.11.271.jar:?]
        at com.amazonaws.services.s3.AmazonS3Client.doesBucketExist(AmazonS3Client.java:1277) ~[aws-java-sdk-bundle-1.11.271.jar:?]
        at org.apache.hadoop.fs.s3a.S3AFileSystem.lambda$verifyBucketExists$1(S3AFileSystem.java:373) ~[hadoop-aws-3.1.1.3.1.0.0-78.jar:?]
        at org.apache.hadoop.fs.s3a.Invoker.once(Invoker.java:109) ~[hadoop-aws-3.1.1.3.1.0.0-78.jar:?]
        ... 33 more
