How to fix Apache Beam: Failed to serialize and deserialize property 'awsCredentialsProvider'
I am extending the BigQueryTornadoes example from https://github.com/apache/beam. I am modifying it to write to AWS S3 as a sink. In my first iteration, I was able to make it work with the following code.
public static void main(String[] args) {
    Options options = PipelineOptionsFactory.fromArgs(args).withValidation().as(Options.class);
    options.setAwsCredentialsProvider(
        new AWSStaticCredentialsProvider(
            new BasicAWSCredentials(options.getAwsAccessKey().get(), options.getAwsSecretKey().get())));
    runBigQueryTornadoes(options);
}
For my second iteration, I wanted to use STSAssumeRoleSessionCredentialsProvider to support cross-account IAM roles. I have the following code.
public static void main(String[] args) {
    Options options = PipelineOptionsFactory.fromArgs(args).withValidation().as(Options.class);
    AWSCredentialsProvider provider =
        new AWSStaticCredentialsProvider(
            new BasicAWSCredentials(options.getAwsAccessKey().get(), options.getAwsSecretKey().get()));
    AWSSecurityTokenServiceClientBuilder stsBuilder =
        AWSSecurityTokenServiceClientBuilder.standard().withCredentials(provider);
    AWSSecurityTokenService sts = stsBuilder.build();
    AWSCredentialsProvider credentialsProvider =
        new STSAssumeRoleSessionCredentialsProvider.Builder(options.getAwsRoleArn().get(), options.getAwsRoleSession().get())
            .withExternalId(options.getAwsExternalId().get())
            .withStsClient(sts)
            .build();
    options.setAwsCredentialsProvider(credentialsProvider);
    runBigQueryTornadoes(options);
}
When I run the code above, I get the following exception.
Caused by: com.fasterxml.jackson.databind.JsonMappingException: Unexpected IOException (of type java.io.IOException): Failed to serialize and deserialize property 'awsCredentialsProvider' with value 'com.amazonaws.auth.STSAssumeRoleSessionCredentialsProvider@4edb24da'
at com.fasterxml.jackson.databind.JsonMappingException.fromUnexpectedIOE (JsonMappingException.java:338)
at com.fasterxml.jackson.databind.ObjectMapper.writeValueAsBytes (ObjectMapper.java:3432)
at org.apache.beam.runners.direct.DirectRunner.run (DirectRunner.java:163)
at org.apache.beam.runners.direct.DirectRunner.run (DirectRunner.java:67)
at org.apache.beam.sdk.Pipeline.run (Pipeline.java:317)
at org.apache.beam.sdk.Pipeline.run (Pipeline.java:303)
at org.apache.beam.examples.cookbook.BigQueryTornadoesS3STS.runBigQueryTornadoes (BigQueryTornadoesS3STS.java:251)
at org.apache.beam.examples.cookbook.BigQueryTornadoesS3STS.main (BigQueryTornadoesS3STS.java:267)
at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke (Method.java:498)
at org.codehaus.mojo.exec.ExecJavaMojo$1.run (ExecJavaMojo.java:282)
at java.lang.Thread.run (Thread.java:748)
I ran the following mvn command.
mvn compile exec:java -Dexec.mainClass=org.apache.beam.examples.cookbook.BigQueryTornadoesS3STS "-Dexec.args=..." -P direct-runner
I saw a similar post, Beam: Failed to serialize and deserialize property 'awsCredentialsProvider'. However, in my case the problem occurs without packaging the code into a jar.
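The underlying issue is that the runner serializes the pipeline options when the pipeline is submitted, and STSAssumeRoleSessionCredentialsProvider holds a live STS client and refresh state that cannot be round-tripped, while a static provider wrapping plain credential values can. The following is a minimal, dependency-free JDK sketch of that distinction (it uses Java serialization as an analogy, not Beam's actual Jackson mechanism, and all class names are invented for illustration):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class SerializeSketch {
    // Stands in for STSAssumeRoleSessionCredentialsProvider: it holds a
    // live "client" object that is not serializable.
    static class RefreshingProvider {
        final Object stsClient = new Object(); // not Serializable
    }

    // Stands in for AWSStaticCredentialsProvider + BasicSessionCredentials:
    // plain values, trivially serializable.
    static class StaticCredentials implements Serializable {
        final String accessKey, secretKey, sessionToken;
        StaticCredentials(String a, String s, String t) {
            accessKey = a; secretKey = s; sessionToken = t;
        }
    }

    static boolean canSerialize(Object o) {
        try (ObjectOutputStream out = new ObjectOutputStream(new ByteArrayOutputStream())) {
            out.writeObject(o);
            return true;
        } catch (IOException e) {
            return false; // NotSerializableException is an IOException
        }
    }

    public static void main(String[] args) {
        System.out.println(canSerialize(new RefreshingProvider()));               // false
        System.out.println(canSerialize(new StaticCredentials("AK", "SK", "T"))); // true
    }
}
```

This suggests the fix: resolve the assumed-role session to concrete credential values once, and hand Beam only those values.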
Solution
The post I am trying to write to S3 using assumeRole via FileIO with ParquetIO helped me get the code working. With the code below, I can assume a cross-account IAM role and write to an S3 bucket owned by another AWS account.
public static void main(String[] args) {
    Options options = PipelineOptionsFactory.fromArgs(args).withValidation().as(Options.class);
    AWSCredentialsProvider provider =
        new AWSStaticCredentialsProvider(
            new BasicAWSCredentials(options.getAwsAccessKey().get(), options.getAwsSecretKey().get()));
    AWSSecurityTokenServiceClientBuilder stsBuilder =
        AWSSecurityTokenServiceClientBuilder.standard().withCredentials(provider);
    AWSSecurityTokenService sts = stsBuilder.build();
    STSAssumeRoleSessionCredentialsProvider credentials =
        new STSAssumeRoleSessionCredentialsProvider.Builder(options.getAwsRoleArn().get(), options.getAwsRoleSession().get())
            .withExternalId(options.getAwsExternalId().get())
            .withStsClient(sts)
            .build();
    // Resolve the temporary session credentials once and wrap them in a
    // static provider, which can be serialized with the pipeline options.
    options.setAwsCredentialsProvider(
        new AWSStaticCredentialsProvider(credentials.getCredentials()));
    runBigQueryTornadoes(options);
}
Note: This code is based on the BigQueryTornadoes example from https://github.com/apache/beam.
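One caveat with this workaround: calling credentials.getCredentials() freezes a single STS session at pipeline-construction time, so the temporary credentials will not refresh and the session must outlive the job. A minimal JDK sketch of that sanity check (the helper below is hypothetical, not part of Beam or the AWS SDK):

```java
import java.time.Duration;
import java.time.Instant;

public class SessionWindowCheck {
    // Returns true if the expected job runtime fits inside the STS
    // session window ending at `expiry`.
    static boolean sessionCoversJob(Instant expiry, Duration expectedJobRuntime) {
        return Instant.now().plus(expectedJobRuntime).isBefore(expiry);
    }

    public static void main(String[] args) {
        // e.g. a one-hour assumed-role session
        Instant expiry = Instant.now().plus(Duration.ofHours(1));
        System.out.println(sessionCoversJob(expiry, Duration.ofMinutes(30))); // true
        System.out.println(sessionCoversJob(expiry, Duration.ofHours(2)));    // false
    }
}
```

For long-running jobs, requesting a longer session duration on the assume-role call (up to the role's configured maximum) is safer than relying on the default.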