Kafka producer fails to validate records without a PK and returns InvalidRecordException

6ioyuze2 posted on 2021-06-04 in Kafka

I'm having a problem with my Kafka producer. I'm using the Debezium Kafka connector v1.1.0.Final with Kafka 2.4.1. Tables with a PK are all snapshotted and flushed without issues, but unfortunately for tables without a PK it gives me this error:

[2020-04-14 10:00:00,096] INFO   Exporting data from table 'public.table_0' (io.debezium.relational.RelationalSnapshotChangeEventSource:280)
[2020-04-14 10:00:00,097] INFO   For table 'public.table_0' using select statement: 'SELECT * FROM "public"."table_0"' (io.debezium.relational.RelationalSnapshotChangeEventSource:287)
[2020-04-14 10:00:00,519] INFO   Finished exporting 296 records for table 'public.table_0'; total duration '00:00:00.421' (io.debezium.relational.RelationalSnapshotChangeEventSource:330)
[2020-04-14 10:00:00,522] INFO Snapshot - Final stage (io.debezium.pipeline.source.AbstractSnapshotChangeEventSource:79)
[2020-04-14 10:00:00,523] INFO Snapshot ended with SnapshotResult [status=COMPLETED, offset=PostgresOffsetContext [sourceInfo=source_info[server='postgres'db='xxx, lsn=38/C74913C0, txId=4511542, timestamp=2020-04-14T02:00:00.517Z, snapshot=FALSE, schema=public, table=table_0], partition={server=postgres}, lastSnapshotRecord=true]] (io.debezium.pipeline.ChangeEventSourceCoordinator:90)
[2020-04-14 10:00:00,524] INFO Connected metrics set to 'true' (io.debezium.pipeline.metrics.StreamingChangeEventSourceMetrics:59)
[2020-04-14 10:00:00,526] INFO Starting streaming (io.debezium.pipeline.ChangeEventSourceCoordinator:100)
[2020-04-14 10:00:00,550] ERROR WorkerSourceTask{id=pg_dev_pinjammodal-0} failed to send record to table_0: (org.apache.kafka.connect.runtime.WorkerSourceTask:347)
org.apache.kafka.common.InvalidRecordException: This record has failed the validation on broker and hence be rejected.

I have checked the table and the records look valid. I set producer.ack=1 in my configuration. Is that configuration invalid?

e3bfsja2 1#

I've checked the logs, and my mistake was configuring the Kafka topic with log compaction: a compacted topic requires every message to have a key. The messages have no key because the table has no PK, so the broker rejects them during validation. So if your table has no PK and you want to push it to Kafka, don't enable log compaction on that topic.
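In case it helps others, here is a minimal sketch of how to check and change the topic's cleanup policy with the stock Kafka tooling. The topic name postgres.public.table_0 and the --bootstrap-server address are assumptions for illustration; on older Kafka versions kafka-configs.sh takes --zookeeper instead of --bootstrap-server.

# Show the topic's current configuration, including cleanup.policy
bin/kafka-configs.sh --bootstrap-server localhost:9092 \
  --entity-type topics --entity-name postgres.public.table_0 --describe

# Switch the topic from compaction back to normal time/size-based retention
bin/kafka-configs.sh --bootstrap-server localhost:9092 \
  --entity-type topics --entity-name postgres.public.table_0 \
  --alter --add-config cleanup.policy=delete

Alternatively, if you do need compaction, recent Debezium versions support the connector option message.key.columns (for example public.table_0:some_unique_column, a hypothetical column name), which builds a message key from chosen columns so the records carry a key even though the table has no PK.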
