java - How to convert a PCollection&lt;TableRow&gt; to a PCollection&lt;Row&gt; in Apache Beam?

qvsjd97n · asked on 2023-01-01 · Java

I have a PCollection&lt;TableRow&gt; and I want to run a SqlTransform over it. I tried applying SqlTransform directly to the PCollection&lt;TableRow&gt;, but it gives the error below:

**Code snippet**
PCollection<TableRow> rows = [...]
PCollection<Row> rows1 = rows.apply(SqlTransform.query("SELECT max(ID) as max_watermark FROM PCOLLECTION"));
**Error**
[main] WARN org.apache.beam.sdk.io.jdbc.JdbcIO - Unable to infer a schema for type com.google.api.services.bigquery.model.TableRow. Attempting to infer a coder without a schema.
java.lang.IllegalStateException: Cannot call getSchema when there is no schema
        at org.apache.beam.sdk.values.PCollection.getSchema(PCollection.java:331)
        at org.apache.beam.sdk.extensions.sql.impl.schema.BeamPCollectionTable.<init>(BeamPCollectionTable.java:35)
        at org.apache.beam.sdk.extensions.sql.SqlTransform.toTableMap(SqlTransform.java:183)
        at org.apache.beam.sdk.extensions.sql.SqlTransform.expand(SqlTransform.java:138)
        at org.apache.beam.sdk.extensions.sql.SqlTransform.expand(SqlTransform.java:110)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:548)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:482)
        at org.apache.beam.sdk.values.PCollection.apply(PCollection.java:360)
        at org.example.Main.main(Main.java:162)

According to the documentation, SqlTransform expects a PCollection&lt;Row&gt; as input, so I tried to convert the PCollection&lt;TableRow&gt; into a PCollection&lt;Row&gt; with the logic below, but that leads to a coder error.
"我试过逻辑"

final Schema schema = Schema.builder()
    .addStringField("ID")
    .build();

PCollection<Row> rows11 = rows.apply(ParDo.of(new DoFn<TableRow, Row>() {
    @ProcessElement
    public void processElement(@Element TableRow inRow, OutputReceiver<Row> out) {
        Row r = Row.withSchema(schema)
                .addValues(inRow.get("ID"))
                .build();
        out.output(r);
    }
}));

PCollection<Row> rows12 = rows11.apply(SqlTransform.query("SELECT max(ID) as max_watermark FROM PCOLLECTION"));

But this produces the error below, and I don't know what I'm missing here. *My complete use case is to take the maximum value of the ID column from `rows` (the PCollection&lt;TableRow&gt;) and store it in one of my BigQuery tables.*

**Error**
[main] WARN org.apache.beam.sdk.io.jdbc.JdbcIO - Unable to infer a schema for type com.google.api.services.bigquery.model.TableRow. Attempting to infer a coder without a schema.
java.lang.IllegalStateException: Unable to return a default Coder for ParDo(Anonymous)/ParMultiDo(Anonymous).output [PCollection@185939155]. Correct one of the following root causes:
  No Coder has been manually specified;  you may do so using .setCoder().
  Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
  Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.sdk.util.Preconditions.checkStateNotNull(Preconditions.java:471)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:284)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:115)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:482)
        at org.apache.beam.sdk.values.PCollection.apply(PCollection.java:360)
        at org.example.Main.main(Main.java:159)

**jjjwad0x** (answer 1)

The resulting PCollection&lt;Row&gt; is missing a .setCoder(RowCoder.of(schema)); coders are defined at the PCollection level, not inside the DoFn.
See https://beam.apache.org/documentation/programming-guide/#schemas for details.
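
A minimal sketch of how the conversion could look with the schema attached (keeping the question's assumption that ID is a string field; .setRowSchema(schema) is the schema-aware shorthand for .setCoder(RowCoder.of(schema))):

import com.google.api.services.bigquery.model.TableRow;
import org.apache.beam.sdk.extensions.sql.SqlTransform;
import org.apache.beam.sdk.schemas.Schema;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.transforms.ParDo;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.Row;

final Schema schema = Schema.builder()
    .addStringField("ID")
    .build();

PCollection<Row> beamRows = rows
    .apply("TableRowToRow", ParDo.of(new DoFn<TableRow, Row>() {
        @ProcessElement
        public void processElement(@Element TableRow inRow, OutputReceiver<Row> out) {
            out.output(Row.withSchema(schema)
                .addValue((String) inRow.get("ID")) // assumes ID arrives as a String
                .build());
        }
    }))
    // The missing piece: attach the schema to the output PCollection so the
    // runner can pick a coder and SqlTransform can resolve the field types.
    .setRowSchema(schema); // equivalent to .setCoder(RowCoder.of(schema))

PCollection<Row> maxWatermark = beamRows
    .apply(SqlTransform.query("SELECT max(ID) as max_watermark FROM PCOLLECTION"));

If Beam SQL rejects MAX over a string column, a variation to try is declaring ID with a numeric field type (e.g. addInt64Field) and parsing the TableRow value accordingly.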
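
For the stated end goal of storing the maximum ID in a BigQuery table, one possible continuation (a sketch; the destination table reference and the max_watermark field handling are placeholders for your own setup) is to map the result Row back to a TableRow and write it with BigQueryIO:

import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.TypeDescriptor;

maxWatermark
    .apply("RowToTableRow", MapElements
        .into(TypeDescriptor.of(TableRow.class))
        .via((Row r) -> new TableRow().set("max_watermark", r.getString("max_watermark"))))
    .setCoder(TableRowJsonCoder.of())              // be explicit about the TableRow coder
    .apply("WriteWatermark", BigQueryIO.writeTableRows()
        .to("my-project:my_dataset.watermark")     // placeholder table reference
        .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_NEVER)
        .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND));

Since the SqlTransform output already carries a schema, BigQueryIO.write().useBeamSchema() should also be able to write the Rows directly, without the manual mapping step.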
