Can't sink a KSQL table to Postgres with the JDBC sink connector
I created a KSQL table containing some aggregates over a stream (the source topic is Avro). I can see the data with a SELECT, and I can sink the stream's source topic to Postgres directly. However, I can't sink the KSQL table itself to Postgres. How do I specify value.converter?
I created the KSQL table as follows:
CREATE TABLE some_table AS SELECT customer_name, COUNT(*) as cnt FROM some_stream GROUP BY customer_name;
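(For reference, a CTAS statement can also pin the output topic's serialization format explicitly rather than inheriting defaults; a minimal sketch, assuming your KSQL version supports the WITH clause on CREATE TABLE AS SELECT:)

-- Force the table's backing topic to be written as Avro
CREATE TABLE some_table
  WITH (VALUE_FORMAT='AVRO') AS
  SELECT customer_name, COUNT(*) AS cnt
  FROM some_stream
  GROUP BY customer_name;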
I tried a connector configuration like this:
connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
key.converter.schemas.enable=false
value.converter.schema.registry.url=http://localhost:8081
auto.evolve=true
tasks.max=1
topics=some_table
auto.create=true
value.converter=io.confluent.connect.avro.AvroConverter
connection.url=jdbc:postgresql://localhost:5432/mydb?user=postgres&password=postgres
key.converter=org.apache.kafka.connect.storage.StringConverter
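Before debugging the connector itself, it's worth confirming that Schema Registry is reachable and that a value schema was actually registered for the table's backing topic. A quick check, assuming the registry runs at http://localhost:8081 as in the config above (the subject name below is an assumption — KSQL typically upper-cases topic names):

# List registered subjects; the table's topic should appear as <TOPIC>-value
curl http://localhost:8081/subjects
# Fetch the latest value schema for the assumed topic name
curl http://localhost:8081/subjects/SOME_TABLE-value/versions/latest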
The error is:
org.apache.kafka.connect.errors.ConnectException: Tolerance exceeded in error handler
at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndHandleError(RetryWithToleranceOperator.java:178)
at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execute(RetryWithToleranceOperator.java:104)
at org.apache.kafka.connect.runtime.WorkerSinkTask.convertAndTransformRecord(WorkerSinkTask.java:514)
at org.apache.kafka.connect.runtime.WorkerSinkTask.convertMessages(WorkerSinkTask.java:491)
at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:322)
at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:226)
at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:194)
at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:175)
at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:219)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.kafka.connect.errors.DataException: Failed to deserialize data for topic some_topic to Avro:
at io.confluent.connect.avro.AvroConverter.toConnectData(AvroConverter.java:107)
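The "Failed to deserialize data ... to Avro" cause means the AvroConverter could not decode the bytes it was handed, either because they are not Confluent wire-format Avro or because the schema lookup failed. One way to inspect what is actually on the topic; a sketch assuming a local broker and registry, with the topic name an assumption:

kafka-avro-console-consumer --bootstrap-server localhost:9092 \
  --topic SOME_TABLE --from-beginning \
  --property schema.registry.url=http://localhost:8081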
I also tried:
{
  "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
  "key.converter.schemas.enable": "false",
  "auto.evolve": "true",
  "tasks.max": "1",
  "topics": "some_topic",
  "value.converter.schemas.enable": "false",
  "auto.create": "true",
  "value.converter": "org.apache.kafka.connect.json.JsonConverter",
  "connection.url": "jdbc:postgresql://localhost:5432/mydb?user=postgres&password=postgres",
  "key.converter": "org.apache.kafka.connect.storage.StringConverter"
}
This time the error is:
org.apache.kafka.connect.errors.ConnectException: Tolerance exceeded in error handler
at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndHandleError(RetryWithToleranceOperator.java:178)
at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execute(RetryWithToleranceOperator.java:104)
at org.apache.kafka.connect.runtime.WorkerSinkTask.convertAndTransformRecord(WorkerSinkTask.java:514)
at org.apache.kafka.connect.runtime.WorkerSinkTask.convertMessages(WorkerSinkTask.java:491)
at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:322)
at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:226)
at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:194)
at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:175)
at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:219)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.kafka.connect.errors.DataException: Converting byte[] to Kafka Connect data failed due to serialization error:
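This second failure is consistent with the first: if KSQL wrote the table's topic as Avro, the JsonConverter receives Confluent wire-format bytes (a 0x00 magic byte plus a 4-byte schema id) and cannot parse them as JSON, regardless of schemas.enable. While experimenting, it can help to make the task fail fast and log the offending record instead of retrying; a sketch using Kafka Connect's standard error-handling properties:

# Fail immediately on conversion errors and log the failing record
errors.tolerance=none
errors.log.enable=true
errors.log.include.messages=true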
So how do I sink a KSQL table to Postgres with the JdbcSinkConnector?
1 Answer
Check that your Schema Registry is online, and that the correct URL is set in value.converter.schema.registry.url.
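A sink configuration consistent with that advice, as a sketch; the topic name SOME_TABLE is an assumption (KSQL upper-cases identifiers by default, so verify the actual backing topic with SHOW TABLES or LIST TOPICS):

connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
tasks.max=1
# Verify the real topic name; KSQL usually upper-cases it
topics=SOME_TABLE
connection.url=jdbc:postgresql://localhost:5432/mydb?user=postgres&password=postgres
auto.create=true
auto.evolve=true
# The table's values are Avro, so use the AvroConverter with a reachable registry URL
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://localhost:8081
# GROUP BY keys are written as plain strings, so StringConverter works for the key
key.converter=org.apache.kafka.connect.storage.StringConverter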