Integrate big SQL Server tables with Airbyte

5q4ezhmt · posted 2023-03-22 in SQL Server

I deployed Airbyte on a public GKE cluster to replicate tables from SQL Server 2017 to BigQuery.

This is what I configured for the SQL Server source:

  • Replication Method -> Logical Replication (CDC)
  • Data to Sync -> Existing and new
  • Initial Snapshot Isolation Level -> Read Committed
  • Initial Waiting Time in Seconds -> 300

I created a connection to integrate all tables in the specified database, but got the following error message.

2023-03-18 14:17:13 INFO i.a.a.c.AirbyteApiClient(retryWithJitter):172 - Attempt 0 to get state
2023-03-18 14:17:13 INFO i.a.a.c.AirbyteApiClient(retryWithJitter):172 - Attempt 0 to set attempt sync config
2023-03-18 14:18:14 WARN i.t.i.w.ActivityWorker$TaskHandlerImpl(logExceptionDuringResultReporting):365 - Failure during reporting of activity result to the server. ActivityId = 16725bd6-efd4-3750-a7a8-6b1fac6278a8, ActivityType = GetSyncWorkflowInputWithAttemptNumber, WorkflowId=connection_manager_a9d2fce3-40b9-4a46-8356-82281c2874d7, WorkflowType=ConnectionManagerWorkflow, RunId=5e2b7511-b195-43a1-8f37-9e0b310c83f0
io.grpc.StatusRuntimeException: RESOURCE_EXHAUSTED: grpc: received message larger than max (6271349 vs. 4194304)
    at io.grpc.stub.ClientCalls.toStatusRuntimeException(ClientCalls.java:271) ~[grpc-stub-1.52.1.jar:1.52.1]
    at io.grpc.stub.ClientCalls.getUnchecked(ClientCalls.java:252) ~[grpc-stub-1.52.1.jar:1.52.1]
    at io.grpc.stub.ClientCalls.blockingUnaryCall(ClientCalls.java:165) ~[grpc-stub-1.52.1.jar:1.52.1]
    at io.temporal.api.workflowservice.v1.WorkflowServiceGrpc$WorkflowServiceBlockingStub.respondActivityTaskCompleted(WorkflowServiceGrpc.java:3840) ~[temporal-serviceclient-1.17.0.jar:?]
    at io.temporal.internal.worker.ActivityWorker$TaskHandlerImpl.lambda$sendReply$0(ActivityWorker.java:303) ~[temporal-sdk-1.17.0.jar:?]
    at io.temporal.internal.retryer.GrpcRetryer.lambda$retry$0(GrpcRetryer.java:52) ~[temporal-serviceclient-1.17.0.jar:?]
    at io.temporal.internal.retryer.GrpcSyncRetryer.retry(GrpcSyncRetryer.java:67) ~[temporal-serviceclient-1.17.0.jar:?]
    at io.temporal.internal.retryer.GrpcRetryer.retryWithResult(GrpcRetryer.java:60) ~[temporal-serviceclient-1.17.0.jar:?]
    at io.temporal.internal.retryer.GrpcRetryer.retry(GrpcRetryer.java:50) ~[temporal-serviceclient-1.17.0.jar:?]
    at io.temporal.internal.worker.ActivityWorker$TaskHandlerImpl.sendReply(ActivityWorker.java:298) ~[temporal-sdk-1.17.0.jar:?]
    at io.temporal.internal.worker.ActivityWorker$TaskHandlerImpl.handleActivity(ActivityWorker.java:252) ~[temporal-sdk-1.17.0.jar:?]
    at io.temporal.internal.worker.ActivityWorker$TaskHandlerImpl.handle(ActivityWorker.java:206) ~[temporal-sdk-1.17.0.jar:?]
    at io.temporal.internal.worker.ActivityWorker$TaskHandlerImpl.handle(ActivityWorker.java:179) ~[temporal-sdk-1.17.0.jar:?]
    at io.temporal.internal.worker.PollTaskExecutor.lambda$process$0(PollTaskExecutor.java:93) ~[temporal-sdk-1.17.0.jar:?]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?]
    at java.lang.Thread.run(Thread.java:1589) ~[?:?]
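The two numbers in the RESOURCE_EXHAUSTED line are the payload size and gRPC's default maximum inbound message size of 4 MiB: the Temporal activity reply carrying the sync state has grown to roughly 6 MB, about 1.5x the limit. A quick check of the arithmetic:

```python
# gRPC's default max inbound message size is 4 MiB; the failing
# Temporal activity reply was 6,271,349 bytes, as the error reports.
DEFAULT_GRPC_MAX = 4 * 1024 * 1024   # 4194304, the "max" in the error
REPLY_SIZE = 6_271_349

print(DEFAULT_GRPC_MAX)
print(REPLY_SIZE > DEFAULT_GRPC_MAX)            # the reply exceeds the limit
print(round(REPLY_SIZE / DEFAULT_GRPC_MAX, 2))  # roughly 1.5x over
```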

Does anyone know what happened and how to fix it?

I also tried integrating a single table: a small table can be integrated into BigQuery successfully, but a big table (about 1 GB) fails.

  • The error message shown on the connection list page looks like this:
Sync Failed
Last attempt: 359.96 MB | 175,642 emitted records | no records | 8m 29s
Failure Origin: source, Message: Something went wrong in the connector. See the logs for more details.

The following is a partial excerpt of the detailed log:

2023-03-19 07:07:27 normalization > Running: transform-catalog --integration-type bigquery --profile-config-dir /config --catalog destination_catalog.json --out /config/models/generated/ --json-column _airbyte_data
2023-03-19 07:07:27 normalization > Processing destination_catalog.json...
2023-03-19 07:07:27 normalization >   Generating airbyte_ctes/ods_ERP/ACRTB_ab1.sql from ACRTB
2023-03-19 07:07:27 normalization >   Generating airbyte_ctes/ods_ERP/ACRTB_ab2.sql from ACRTB
2023-03-19 07:07:27 normalization >   Generating airbyte_views/ods_ERP/ACRTB_stg.sql from ACRTB
2023-03-19 07:07:27 normalization >   Generating airbyte_incremental/scd/ods_ERP/ACRTB_scd.sql from ACRTB
2023-03-19 07:07:27 normalization >   Generating airbyte_incremental/ods_ERP/ACRTB.sql from ACRTB
2023-03-19 07:07:27 normalization > detected no config file for ssh, assuming ssh is off.
2023-03-19 07:07:30 normalization >            [--event-buffer-size EVENT_BUFFER_SIZE]
2023-03-19 07:07:30 normalization >   --event-buffer-size EVENT_BUFFER_SIZE
2023-03-19 07:07:30 normalization > DBT >=1.0.0 detected; using 10K event buffer size
2023-03-19 07:07:34 normalization > Running with dbt=1.0.0
2023-03-19 07:07:34 normalization > Partial parse save file not found. Starting full parse.
2023-03-19 07:07:36 normalization > [WARNING]: Configuration paths exist in your dbt_project.yml file which do not apply to any resources.
There are 1 unused configuration paths:
- models.airbyte_utils.generated.airbyte_tables

2023-03-19 07:07:36 normalization > Found 5 models, 0 tests, 0 snapshots, 0 analyses, 624 macros, 0 operations, 0 seed files, 1 source, 0 exposures, 0 metrics
2023-03-19 07:07:38 normalization > Concurrency: 8 threads (target='prod')
2023-03-19 07:07:38 normalization > 1 of 3 START view model _airbyte_ods_ERP.ACRTB_stg...................................................................... [RUN]
2023-03-19 07:07:39 normalization > 1 of 3 OK created view model _airbyte_ods_ERP.ACRTB_stg................................................................. [OK in 0.96s]
2023-03-19 07:07:39 normalization > 2 of 3 START incremental model ods_ERP.ACRTB_scd........................................................................ [RUN]
2023-03-19 07:07:19 ERROR i.a.w.i.DefaultAirbyteStreamFactory(validate):134 - Validation failed: null
2023-03-19 07:07:20 INFO i.a.w.p.KubePodProcess(close):725 - (pod: default / destination-bigquery-write-13-2-rjtle) - Closed all resources for pod
2023-03-19 07:07:20 ERROR i.a.w.g.DefaultReplicationWorker(replicate):259 - Sync worker failed.
java.util.concurrent.ExecutionException: io.airbyte.workers.internal.exception.SourceException: Source cannot be stopped!
    at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:396) ~[?:?]
    at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:2073) ~[?:?]
    at io.airbyte.workers.general.DefaultReplicationWorker.replicate(DefaultReplicationWorker.java:251) ~[io.airbyte-airbyte-commons-worker-0.41.0.jar:?]
    at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:175) ~[io.airbyte-airbyte-commons-worker-0.41.0.jar:?]
    at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:91) ~[io.airbyte-airbyte-commons-worker-0.41.0.jar:?]
    at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$5(TemporalAttemptExecution.java:195) ~[io.airbyte-airbyte-workers-0.41.0.jar:?]
    at java.lang.Thread.run(Thread.java:1589) ~[?:?]
    Suppressed: io.airbyte.workers.exception.WorkerException: Source process exit with code 1. This warning is normal if the job was cancelled.
        at io.airbyte.workers.internal.DefaultAirbyteSource.close(DefaultAirbyteSource.java:162) ~[io.airbyte-airbyte-commons-worker-0.41.0.jar:?]
        at io.airbyte.workers.general.DefaultReplicationWorker.replicate(DefaultReplicationWorker.java:196) ~[io.airbyte-airbyte-commons-worker-0.41.0.jar:?]
        at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:175) ~[io.airbyte-airbyte-commons-worker-0.41.0.jar:?]
        at io.airbyte.workers.general.DefaultReplicationWorker.run(DefaultReplicationWorker.java:91) ~[io.airbyte-airbyte-commons-worker-0.41.0.jar:?]
        at io.airbyte.workers.temporal.TemporalAttemptExecution.lambda$getWorkerThread$5(TemporalAttemptExecution.java:195) ~[io.airbyte-airbyte-workers-0.41.0.jar:?]
        at java.lang.Thread.run(Thread.java:1589) ~[?:?]
Caused by: io.airbyte.workers.internal.exception.SourceException: Source cannot be stopped!
    at io.airbyte.workers.general.DefaultReplicationWorker.lambda$readFromSrcAndWriteToDstRunnable$7(DefaultReplicationWorker.java:392) ~[io.airbyte-airbyte-commons-worker-0.41.0.jar:?]
    at java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ~[?:?]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?]
    ... 1 more
Caused by: io.airbyte.workers.exception.WorkerException: Source process exit with code 1. This warning is normal if the job was cancelled.
    at io.airbyte.workers.internal.DefaultAirbyteSource.close(DefaultAirbyteSource.java:162) ~[io.airbyte-airbyte-commons-worker-0.41.0.jar:?]
    at io.airbyte.workers.general.DefaultReplicationWorker.lambda$readFromSrcAndWriteToDstRunnable$7(DefaultReplicationWorker.java:390) ~[io.airbyte-airbyte-commons-worker-0.41.0.jar:?]
    at java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) ~[?:?]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?]
    ... 1 more
2023-03-19 07:07:20 INFO i.a.w.g.DefaultReplicationWorker(prepStateForLaterSaving):563 - Source did not output any state messages
2023-03-19 07:07:20 WARN i.a.w.g.DefaultReplicationWorker(prepStateForLaterSaving):574 - State capture: No state retained.
2023-03-19 07:07:20 INFO i.a.w.g.DefaultReplicationWorker(getReplicationOutput):494 - sync summary: {
  "status" : "failed",
  "recordsSynced" : 31229,
  "bytesSynced" : 43608448,
  "startTime" : 1679209530317,
  "endTime" : 1679209640938,
  "totalStats" : {
    "bytesEmitted" : 43608448,
    "destinationStateMessagesEmitted" : 0,
    "destinationWriteEndTime" : 1679209640843,
    "destinationWriteStartTime" : 1679209541726,
    "meanSecondsBeforeSourceStateMessageEmitted" : 0,
    "maxSecondsBeforeSourceStateMessageEmitted" : 0,
    "maxSecondsBetweenStateMessageEmittedandCommitted" : 0,
    "meanSecondsBetweenStateMessageEmittedandCommitted" : 0,
    "recordsEmitted" : 31229,
    "recordsCommitted" : 0,
    "replicationEndTime" : 1679209640938,
    "replicationStartTime" : 1679209530317,
    "sourceReadEndTime" : 0,
    "sourceReadStartTime" : 1679209535611,
    "sourceStateMessagesEmitted" : 0
  },
  "streamStats" : [ {
    "streamName" : "ACRTB",
    "stats" : {
      "bytesEmitted" : 43608448,
      "recordsEmitted" : 31229
    }
  } ]
}
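The sync summary above already hints at the failure mode: records were emitted, but none were committed and no source state messages were produced, so the attempt cannot checkpoint and a retry restarts from scratch. A small sanity check over the numbers copied from that summary:

```python
# Values copied from the sync summary in the log above.
summary = {
    "recordsEmitted": 31229,
    "recordsCommitted": 0,
    "bytesEmitted": 43608448,
    "sourceStateMessagesEmitted": 0,
}

# Rough average record size in bytes:
avg_record_bytes = summary["bytesEmitted"] / summary["recordsEmitted"]
print(round(avg_record_bytes))  # ~1396 bytes per record

# No commits and no state means nothing is checkpointed for a retry:
no_checkpoint = (
    summary["recordsCommitted"] == 0
    and summary["sourceStateMessagesEmitted"] == 0
)
print(no_checkpoint)
```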
2023-03-19 07:07:20 INFO i.a.w.g.DefaultReplicationWorker(getReplicationOutput):495 - failures: [ {
  "failureOrigin" : "source",
  "failureType" : "system_error",
  "internalMessage" : "java.lang.RuntimeException: org.apache.kafka.connect.errors.ConnectException: java.io.StreamCorruptedException: unexpected EOF in middle of data block",
  "externalMessage" : "Something went wrong in the connector. See the logs for more details.",
  "metadata" : {
    "attemptNumber" : 2,
    "jobId" : 13,
    "from_trace_message" : true,
    "connector_command" : "read"
  },
  "stacktrace" : "java.lang.RuntimeException: org.apache.kafka.connect.errors.ConnectException: java.io.StreamCorruptedException: unexpected EOF in middle of data block\n\tat io.airbyte.integrations.debezium.internals.DebeziumStateDecoratingIterator.computeNext(DebeziumStateDecoratingIterator.java:163)\n\tat io.airbyte.integrations.debezium.internals.DebeziumStateDecoratingIterator.computeNext(DebeziumStateDecoratingIterator.java:27)\n\tat com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:146)\n\tat com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:141)\n\tat io.airbyte.commons.util.DefaultAutoCloseableIterator.computeNext(DefaultAutoCloseableIterator.java:38)\n\tat com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:146)\n\tat com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:141)\n\tat io.airbyte.commons.util.CompositeIterator.computeNext(CompositeIterator.java:63)\n\tat com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:146)\n\tat com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:141)\n\tat io.airbyte.commons.util.DefaultAutoCloseableIterator.computeNext(DefaultAutoCloseableIterator.java:38)\n\tat com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:146)\n\tat com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:141)\n\tat io.airbyte.commons.util.DefaultAutoCloseableIterator.computeNext(DefaultAutoCloseableIterator.java:38)\n\tat com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:146)\n\tat com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:141)\n\tat java.base/java.util.Iterator.forEachRemaining(Iterator.java:132)\n\tat io.airbyte.integrations.base.IntegrationRunner.lambda$produceMessages$0(IntegrationRunner.java:187)\n\tat 
io.airbyte.integrations.base.IntegrationRunner.watchForOrphanThreads(IntegrationRunner.java:237)\n\tat io.airbyte.integrations.base.IntegrationRunner.produceMessages(IntegrationRunner.java:186)\n\tat io.airbyte.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:139)\n\tat io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:98)\n\tat io.airbyte.integrations.source.mssql.MssqlSource.main(MssqlSource.java:530)\nCaused by: org.apache.kafka.connect.errors.ConnectException: java.io.StreamCorruptedException: unexpected EOF in middle of data block\n\tat io.airbyte.integrations.debezium.internals.AirbyteFileOffsetBackingStore.load(AirbyteFileOffsetBackingStore.java:137)\n\tat io.airbyte.integrations.debezium.internals.AirbyteFileOffsetBackingStore.read(AirbyteFileOffsetBackingStore.java:57)\n\tat io.airbyte.integrations.debezium.internals.DebeziumStateDecoratingIterator.computeNext(DebeziumStateDecoratingIterator.java:148)\n\t... 22 more\nCaused by: java.io.StreamCorruptedException: unexpected EOF in middle of data block\n\tat java.base/java.io.ObjectInputStream$BlockDataInputStream.refill(ObjectInputStream.java:3155)\n\tat java.base/java.io.ObjectInputStream$BlockDataInputStream.read(ObjectInputStream.java:3231)\n\tat java.base/java.io.DataInputStream.readInt(DataInputStream.java:393)\n\tat java.base/java.io.ObjectInputStream$BlockDataInputStream.readInt(ObjectInputStream.java:3436)\n\tat java.base/java.io.ObjectInputStream.readInt(ObjectInputStream.java:1128)\n\tat java.base/java.util.HashMap.readObject(HashMap.java:1523)\n\tat java.base/jdk.internal.reflect.GeneratedMethodAccessor17.invoke(Unknown Source)\n\tat java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)\n\tat java.base/java.lang.reflect.Method.invoke(Method.java:568)\n\tat java.base/java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1100)\n\tat 
java.base/java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2423)\n\tat java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2257)\n\tat java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1733)\n\tat java.base/java.io.ObjectInputStream.readObject(ObjectInputStream.java:509)\n\tat java.base/java.io.ObjectInputStream.readObject(ObjectInputStream.java:467)\n\tat io.airbyte.integrations.debezium.internals.AirbyteFileOffsetBackingStore.load(AirbyteFileOffsetBackingStore.java:120)\n\t... 24 more\n",
  "timestamp" : 1679209627560
}, {
  "failureOrigin" : "source",
  "internalMessage" : "io.airbyte.workers.internal.exception.SourceException: Source cannot be stopped!",
  "externalMessage" : "Something went wrong within the source connector",
  "metadata" : {
    "attemptNumber" : 2,
    "jobId" : 13,
    "connector_command" : "read"
  },
  "stacktrace" : "java.util.concurrent.CompletionException: io.airbyte.workers.internal.exception.SourceException: Source cannot be stopped!\n\tat java.base/java.util.concurrent.CompletableFuture.encodeThrowable(CompletableFuture.java:315)\n\tat java.base/java.util.concurrent.CompletableFuture.completeThrowable(CompletableFuture.java:320)\n\tat java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1807)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)\n\tat java.base/java.lang.Thread.run(Thread.java:1589)\nCaused by: io.airbyte.workers.internal.exception.SourceException: Source cannot be stopped!\n\tat io.airbyte.workers.general.DefaultReplicationWorker.lambda$readFromSrcAndWriteToDstRunnable$7(DefaultReplicationWorker.java:392)\n\tat java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804)\n\t... 3 more\nCaused by: io.airbyte.workers.exception.WorkerException: Source process exit with code 1. This warning is normal if the job was cancelled.\n\tat io.airbyte.workers.internal.DefaultAirbyteSource.close(DefaultAirbyteSource.java:162)\n\tat io.airbyte.workers.general.DefaultReplicationWorker.lambda$readFromSrcAndWriteToDstRunnable$7(DefaultReplicationWorker.java:390)\n\t... 4 more\n",
  "timestamp" : 1679209628767
} ]

Does anyone else have this problem?

Or how should I configure the connection to integrate a big table?

xdnvmnnf1#

I got almost the same error two days ago, right after one of the source tables went through a large data migration.

2023-03-19 11:03:06 source > ERROR i.a.i.b.AirbyteExceptionHandler(uncaughtException):26 Something went wrong in the connector. See the logs for more details. java.lang.RuntimeException: org.apache.kafka.connect.errors.ConnectException: java.io.StreamCorruptedException: unexpected EOF in middle of data block
    at io.airbyte.integrations.debezium.internals.DebeziumStateDecoratingIterator.computeNext(DebeziumStateDecoratingIterator.java:163) ~[io.airbyte.airbyte-integrations.bases-debezium-0.42.0.jar:?]
    at io.airbyte.integrations.debezium.internals.DebeziumStateDecoratingIterator.computeNext(DebeziumStateDecoratingIterator.java:27) ~[io.airbyte.airbyte-integrations.bases-debezium-0.42.0.jar:?]
    at com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:146) ~[guava-31.1-jre.jar:?]
    at com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:141) ~[guava-31.1-jre.jar:?]
    at io.airbyte.commons.util.DefaultAutoCloseableIterator.computeNext(DefaultAutoCloseableIterator.java:38) ~[io.airbyte-airbyte-commons-0.42.0.jar:?]
    at com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:146) ~[guava-31.1-jre.jar:?]
    at com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:141) ~[guava-31.1-jre.jar:?]
    at io.airbyte.commons.util.CompositeIterator.computeNext(CompositeIterator.java:63) ~[io.airbyte-airbyte-commons-0.42.0.jar:?]
    at com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:146) ~[guava-31.1-jre.jar:?]
    at com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:141) ~[guava-31.1-jre.jar:?]
    at io.airbyte.commons.util.DefaultAutoCloseableIterator.computeNext(DefaultAutoCloseableIterator.java:38) ~[io.airbyte-airbyte-commons-0.42.0.jar:?]
    at com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:146) ~[guava-31.1-jre.jar:?]
    at com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:141) ~[guava-31.1-jre.jar:?]
    at io.airbyte.commons.util.DefaultAutoCloseableIterator.computeNext(DefaultAutoCloseableIterator.java:38) ~[io.airbyte-airbyte-commons-0.42.0.jar:?]
    at com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:146) ~[guava-31.1-jre.jar:?]
    at com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:141) ~[guava-31.1-jre.jar:?]
    at java.util.Iterator.forEachRemaining(Iterator.java:132) ~[?:?]
    at io.airbyte.integrations.base.IntegrationRunner.lambda$produceMessages$0(IntegrationRunner.java:187) ~[io.airbyte.airbyte-integrations.bases-base-java-0.42.0.jar:?]
    at io.airbyte.integrations.base.IntegrationRunner.watchForOrphanThreads(IntegrationRunner.java:237) ~[io.airbyte.airbyte-integrations.bases-base-java-0.42.0.jar:?]
    at io.airbyte.integrations.base.IntegrationRunner.produceMessages(IntegrationRunner.java:186) ~[io.airbyte.airbyte-integrations.bases-base-java-0.42.0.jar:?]
    at io.airbyte.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:139) ~[io.airbyte.airbyte-integrations.bases-base-java-0.42.0.jar:?]
    at io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:98) ~[io.airbyte.airbyte-integrations.bases-base-java-0.42.0.jar:?]
    at io.airbyte.integrations.source.mysql.MySqlSource.main(MySqlSource.java:400) ~[io.airbyte.airbyte-integrations.connectors-source-mysql-0.42.0.jar:?]
Caused by: org.apache.kafka.connect.errors.ConnectException: java.io.StreamCorruptedException: unexpected EOF in middle of data block
    at io.airbyte.integrations.debezium.internals.AirbyteFileOffsetBackingStore.load(AirbyteFileOffsetBackingStore.java:137) ~[io.airbyte.airbyte-integrations.bases-debezium-0.42.0.jar:?]
    at io.airbyte.integrations.debezium.internals.AirbyteFileOffsetBackingStore.read(AirbyteFileOffsetBackingStore.java:57) ~[io.airbyte.airbyte-integrations.bases-debezium-0.42.0.jar:?]
    at io.airbyte.integrations.debezium.internals.DebeziumStateDecoratingIterator.computeNext(DebeziumStateDecoratingIterator.java:148) ~[io.airbyte.airbyte-integrations.bases-debezium-0.42.0.jar:?]
    ... 22 more
Caused by: java.io.StreamCorruptedException: unexpected EOF in middle of data block
    at java.io.ObjectInputStream$BlockDataInputStream.refill(ObjectInputStream.java:3155) ~[?:?]
    at java.io.ObjectInputStream$BlockDataInputStream.read(ObjectInputStream.java:3231) ~[?:?]
    at java.io.DataInputStream.readInt(DataInputStream.java:393) ~[?:?]
    at java.io.ObjectInputStream$BlockDataInputStream.readInt(ObjectInputStream.java:3436) ~[?:?]
    at java.io.ObjectInputStream.readInt(ObjectInputStream.java:1128) ~[?:?]
    at java.util.HashMap.readObject(HashMap.java:1523) ~[?:?]
    at jdk.internal.reflect.GeneratedMethodAccessor31.invoke(Unknown Source) ~[?:?]
    at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
    at java.lang.reflect.Method.invoke(Method.java:568) ~[?:?]
    at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1100) ~[?:?]
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2423) ~[?:?]
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2257) ~[?:?]
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1733) ~[?:?]
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:509) ~[?:?]
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:467) ~[?:?]
    at io.airbyte.integrations.debezium.internals.AirbyteFileOffsetBackingStore.load(AirbyteFileOffsetBackingStore.java:120) ~[io.airbyte.airbyte-integrations.bases-debezium-0.42.0.jar:?]
    at io.airbyte.integrations.debezium.internals.AirbyteFileOffsetBackingStore.read(AirbyteFileOffsetBackingStore.java:57) ~[io.airbyte.airbyte-integrations.bases-debezium-0.42.0.jar:?]
    at io.airbyte.integrations.debezium.internals.DebeziumStateDecoratingIterator.computeNext(DebeziumStateDecoratingIterator.java:148) ~[io.airbyte.airbyte-integrations.bases-debezium-0.42.0.jar:?]
    ... 22 more


Can anyone point me in a direction? I'm running the latest Airbyte version on a VM with Docker.

It may be worth checking this issue on GitHub: https://github.com/airbytehq/airbyte/issues/24191
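The root cause in both traces is `AirbyteFileOffsetBackingStore.load` hitting a truncated Debezium offset file: Java deserialization reads past the end of a partially written state blob and raises `StreamCorruptedException: unexpected EOF in middle of data block`. The failure mode is easy to reproduce in miniature; here is an analogous sketch using Python's `pickle` in place of Java serialization (the state contents and the truncation point are arbitrary illustrations):

```python
# Analogy for the corrupted-offset failure: serialize a state map,
# truncate it mid-stream (as an interrupted write would), then try to
# load it back. pickle, like Java serialization, raises an EOF-style
# error instead of returning partial data.
import pickle

state = {"offset": "placeholder CDC position", "snapshot": False}
blob = pickle.dumps(state)

truncated = blob[: len(blob) // 2]  # simulate a partially written offset file

try:
    pickle.loads(truncated)
    corrupted = False
except (EOFError, pickle.UnpicklingError):
    corrupted = True

print(corrupted)  # the truncated state blob cannot be loaded
```

The practical implication is that once the persisted offset/state is truncated, every retry fails the same way at load time; the corrupt state has to be discarded (e.g. by resetting the connection) before a sync can succeed again.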
