I am extending the BigQueryTornadoes example from https://github.com/apache/beam. I am making a change so that it can write to AWS S3 as a sink. In my first iteration, I was able to get it working with the code below.
public static void main(String[] args) {
  Options options = PipelineOptionsFactory.fromArgs(args).withValidation().as(Options.class);
  options.setAwsCredentialsProvider(
      new AWSStaticCredentialsProvider(
          new BasicAWSCredentials(options.getAwsAccessKey().get(), options.getAwsSecretKey().get())));
  runBigQueryTornadoes(options);
}
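For reference, the question does not show the Options interface itself. A minimal sketch of what it might look like, inferred from the .get() calls above (the use of ValueProvider<String>, the descriptions, and the decision to extend S3Options are assumptions):

// Hypothetical sketch of the custom Options interface implied by the snippets.
// Extending S3Options makes setAwsCredentialsProvider(...) available.
import org.apache.beam.sdk.io.aws.options.S3Options;
import org.apache.beam.sdk.options.Description;
import org.apache.beam.sdk.options.ValueProvider;

public interface Options extends S3Options {
  @Description("AWS access key")
  ValueProvider<String> getAwsAccessKey();
  void setAwsAccessKey(ValueProvider<String> value);

  @Description("AWS secret key")
  ValueProvider<String> getAwsSecretKey();
  void setAwsSecretKey(ValueProvider<String> value);

  // The BigQuery input/output options from the original example are omitted here.
}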
In my second iteration, I want to use STSAssumeRoleSessionCredentialsProvider to support cross-account IAM roles. I have the following code.
public static void main(String[] args) {
  Options options = PipelineOptionsFactory.fromArgs(args).withValidation().as(Options.class);
  AWSCredentialsProvider provider =
      new AWSStaticCredentialsProvider(
          new BasicAWSCredentials(options.getAwsAccessKey().get(), options.getAwsSecretKey().get()));
  AWSSecurityTokenServiceClientBuilder stsBuilder =
      AWSSecurityTokenServiceClientBuilder.standard().withCredentials(provider);
  AWSSecurityTokenService sts = stsBuilder.build();
  AWSCredentialsProvider credentialsProvider =
      new STSAssumeRoleSessionCredentialsProvider.Builder(
              options.getAwsRoleArn().get(), options.getAwsRoleSession().get())
          .withExternalId(options.getAwsExternalId().get())
          .withStsClient(sts)
          .build();
  options.setAwsCredentialsProvider(credentialsProvider);
  runBigQueryTornadoes(options);
}
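The role-related getters used here (getAwsRoleArn, getAwsRoleSession, getAwsExternalId) are also not declared in the question; presumably they live on the same Options interface sketched above, roughly like this (names and descriptions inferred from the calls):

// Hypothetical additions to the Options interface for the assume-role settings.
@Description("ARN of the cross-account IAM role to assume")
ValueProvider<String> getAwsRoleArn();
void setAwsRoleArn(ValueProvider<String> value);

@Description("Session name to use when assuming the role")
ValueProvider<String> getAwsRoleSession();
void setAwsRoleSession(ValueProvider<String> value);

@Description("External ID expected by the role's trust policy")
ValueProvider<String> getAwsExternalId();
void setAwsExternalId(ValueProvider<String> value);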
When I run the second iteration above, I get the following exception.
Caused by: com.fasterxml.jackson.databind.JsonMappingException: Unexpected IOException (of type java.io.IOException): Failed to serialize and deserialize property 'awsCredentialsProvider' with value 'com.amazonaws.auth.STSAssumeRoleSessionCredentialsProvider@4edb24da'
at com.fasterxml.jackson.databind.JsonMappingException.fromUnexpectedIOE (JsonMappingException.java:338)
at com.fasterxml.jackson.databind.ObjectMapper.writeValueAsBytes (ObjectMapper.java:3432)
at org.apache.beam.runners.direct.DirectRunner.run (DirectRunner.java:163)
at org.apache.beam.runners.direct.DirectRunner.run (DirectRunner.java:67)
at org.apache.beam.sdk.Pipeline.run (Pipeline.java:317)
at org.apache.beam.sdk.Pipeline.run (Pipeline.java:303)
at org.apache.beam.examples.cookbook.BigQueryTornadoesS3STS.runBigQueryTornadoes (BigQueryTornadoesS3STS.java:251)
at org.apache.beam.examples.cookbook.BigQueryTornadoesS3STS.main (BigQueryTornadoesS3STS.java:267)
at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke (Method.java:498)
at org.codehaus.mojo.exec.ExecJavaMojo$1.run (ExecJavaMojo.java:282)
at java.lang.Thread.run (Thread.java:748)
I run it with the following mvn command.
mvn compile exec:java -Dexec.mainClass=org.apache.beam.examples.cookbook.BigQueryTornadoesS3STS "-Dexec.args=..." -P direct-runner
I have seen a similar post about Beam, "Failed to serialize and deserialize property 'awsCredentialsProvider'", but the problem I am facing is not about packaging it into a jar.
1 Answer
The post "Trying to write to S3 using AssumeRole via FileIO with ParquetIO" helped me get my code working. With the code below, I can assume the cross-account IAM role and write to an S3 bucket owned by another AWS account.
Note: the code is based on the BigQueryTornadoes example from https://github.com/apache/beam.
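The answer's full code is not reproduced here. Purely as an illustration of the FileIO + ParquetIO write to S3 that the linked post describes (a sketch, not the answerer's actual code), where the method name, Avro schema, and s3:// path are placeholders and the AWS credentials are assumed to already be configured on the pipeline's AwsOptions:

// Illustrative sketch only: write GenericRecords as Parquet files to an S3 path
// using FileIO with ParquetIO (needs beam-sdks-java-io-parquet and the
// beam-sdks-java-io-amazon-web-services S3 filesystem on the classpath).
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericRecord;
import org.apache.beam.sdk.coders.AvroCoder;
import org.apache.beam.sdk.io.FileIO;
import org.apache.beam.sdk.io.parquet.ParquetIO;
import org.apache.beam.sdk.values.PCollection;

static void writeToS3AsParquet(PCollection<GenericRecord> records, Schema schema, String s3Path) {
  records
      .setCoder(AvroCoder.of(GenericRecord.class, schema))
      .apply(
          "WriteParquetToS3",
          FileIO.<GenericRecord>write()
              .via(ParquetIO.sink(schema))
              .to(s3Path)  // e.g. "s3://cross-account-bucket/tornadoes/" (hypothetical)
              .withSuffix(".parquet"));
}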