Incompatible class versions when submitting a Flink job

34gzjxbg · published 2021-08-20 · Java

I'm trying to submit a streaming job using Beam 2.27 / Flink 1.12 with the following Maven command line:

    mvn exec:java -Dexec.mainClass=org.example.MyPipelineClass -Pflink-runner -Dexec.args="--runner=FlinkRunner --flinkMaster=flink-host:8081 --filesToStage=target/pipelines-bundled-1.0.0.jar"

Everything worked fine for a while, but I recently made some edits and wanted to run a new version of the pipeline, and I now get the following error:

    Caused by: java.io.InvalidClassException: org.apache.flink.streaming.api.graph.StreamConfig$NetworkInputConfig; local class incompatible: stream classdesc serialVersionUID = -3137689219135046939, local class serialVersionUID = 3698633776553163849
        at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:699)
        at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2003)
        at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1850)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2160)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1667)
        at java.io.ObjectInputStream.readArray(ObjectInputStream.java:2093)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1655)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
        at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
        at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
        at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
        at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
        at org.apache.flink.streaming.api.graph.StreamConfig.getInputs(StreamConfig.java:263)
        ... 21 more
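For context: the `serialVersionUID` in that error is the fingerprint Java serialization uses to decide whether the writer's and reader's versions of a class are compatible. When a class doesn't declare it explicitly, the JVM computes it from the class structure, so two different builds of the "same" class can easily produce different values. A minimal illustration of the mechanism, using made-up classes rather than Flink's own:

```java
import java.io.ObjectStreamClass;
import java.io.Serializable;

public class SerialUidDemo {
    // Two hypothetical "versions" of a config class. Neither declares
    // serialVersionUID, so the JVM computes one from the class structure.
    static class ConfigV1 implements Serializable {
        int parallelism;
    }

    // A small structural change (an added field) changes the computed UID,
    // which is what triggers InvalidClassException on deserialization.
    static class ConfigV2 implements Serializable {
        int parallelism;
        String name;
    }

    public static void main(String[] args) {
        long v1 = ObjectStreamClass.lookup(ConfigV1.class).getSerialVersionUID();
        long v2 = ObjectStreamClass.lookup(ConfigV2.class).getSerialVersionUID();
        System.out.println("v1 UID = " + v1);
        System.out.println("v2 UID = " + v2);
        System.out.println("match = " + (v1 == v2));
    }
}
```

In the question's case this means the jar submitted by the client and the classes loaded by the cluster came from different Flink builds of `StreamConfig`.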

My pom.xml looks like this:

    <?xml version="1.0" encoding="UTF-8"?>
    <project xmlns="http://maven.apache.org/POM/4.0.0"
             xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
             xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
      <modelVersion>4.0.0</modelVersion>
      <groupId>org.example</groupId>
      <artifactId>pipelines</artifactId>
      <version>1.0.0</version>
      <packaging>jar</packaging>
      <properties>
        <beam.version>2.27.0</beam.version>
        <flink.version>1.12</flink.version>
        <maven-compiler-plugin.version>3.7.0</maven-compiler-plugin.version>
        <maven-exec-plugin.version>1.6.0</maven-exec-plugin.version>
        <maven-jar-plugin.version>3.2.0</maven-jar-plugin.version>
        <maven-shade-plugin.version>3.1.0</maven-shade-plugin.version>
      </properties>
      <build>
        <plugins>
          <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-compiler-plugin</artifactId>
            <version>${maven-compiler-plugin.version}</version>
            <configuration>
              <source>1.8</source>
              <target>1.8</target>
            </configuration>
          </plugin>
          <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-jar-plugin</artifactId>
            <version>${maven-jar-plugin.version}</version>
          </plugin>
          <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <version>${maven-shade-plugin.version}</version>
            <executions>
              <execution>
                <phase>package</phase>
                <goals>
                  <goal>shade</goal>
                </goals>
                <configuration>
                  <finalName>${project.artifactId}-bundled-${project.version}</finalName>
                  <filters>
                    <filter>
                      <artifact>*:*</artifact>
                      <excludes>
                        <exclude>META-INF/LICENSE</exclude>
                        <exclude>META-INF/*.SF</exclude>
                        <exclude>META-INF/*.DSA</exclude>
                        <exclude>META-INF/*.RSA</exclude>
                      </excludes>
                    </filter>
                  </filters>
                  <transformers>
                    <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
                  </transformers>
                </configuration>
              </execution>
            </executions>
          </plugin>
        </plugins>
        <pluginManagement>
          <plugins>
            <plugin>
              <groupId>org.codehaus.mojo</groupId>
              <artifactId>exec-maven-plugin</artifactId>
              <version>${maven-exec-plugin.version}</version>
              <configuration>
                <cleanupDaemonThreads>false</cleanupDaemonThreads>
              </configuration>
            </plugin>
          </plugins>
        </pluginManagement>
      </build>
      <profiles>
        <profile>
          <id>flink-runner</id>
          <dependencies>
            <dependency>
              <groupId>org.apache.beam</groupId>
              <artifactId>beam-runners-flink-${flink.version}</artifactId>
              <version>${beam.version}</version>
              <scope>runtime</scope>
            </dependency>
          </dependencies>
        </profile>
      </profiles>
      <dependencies>
        <dependency>
          <groupId>org.apache.beam</groupId>
          <artifactId>beam-sdks-java-core</artifactId>
          <version>${beam.version}</version>
        </dependency>
        <!-- note that the pipeline runs on GCP -->
        <dependency>
          <groupId>org.apache.beam</groupId>
          <artifactId>beam-sdks-java-io-google-cloud-platform</artifactId>
          <version>${beam.version}</version>
          <exclusions>
            <exclusion>
              <groupId>com.google.cloud.bigtable</groupId>
              <artifactId>bigtable-client-core</artifactId>
            </exclusion>
          </exclusions>
        </dependency>
        <!-- main output is JDBC/Postgres -->
        <dependency>
          <groupId>org.apache.beam</groupId>
          <artifactId>beam-sdks-java-io-jdbc</artifactId>
          <version>${beam.version}</version>
        </dependency>
        <dependency>
          <groupId>org.postgresql</groupId>
          <artifactId>postgresql</artifactId>
          <version>42.2.4</version>
        </dependency>
      </dependencies>
    </project>

Of course, I run mvn clean package -Pflink-runner before submitting the job.
Flink is self-hosted on a GKE instance and uses the flink:1.12.4-java8 image.
I found this question, which seems to have the same error message, but with no clear culprit identified.
I'd appreciate any help or ideas of what to explore; I'm not sure what to investigate.

bttbmeg0 #1

It turned out this was my own fault. In the pom.xml above I specify the Beam runner as beam-runners-flink-1.12, while the cluster was running Flink 1.12.4.
Beam's Flink runner doesn't pin a patch version and was pulling in Flink 1.12.0.
Downgrading the cluster solved my problem.
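For anyone who can't downgrade the cluster, a possible alternative (an assumption on my part, not something the answer tested) is to explicitly pin Flink's own artifacts in the flink-runner profile to the cluster's patch release, so the client and cluster serialize the same class versions:

```xml
<!-- Hypothetical alternative sketch: add an explicit Flink dependency at the
     cluster's exact version (matching the flink:1.12.4-java8 image) next to
     the Beam runner, so the shaded jar carries matching classes. -->
<profile>
  <id>flink-runner</id>
  <dependencies>
    <dependency>
      <groupId>org.apache.beam</groupId>
      <artifactId>beam-runners-flink-${flink.version}</artifactId>
      <version>${beam.version}</version>
      <scope>runtime</scope>
    </dependency>
    <!-- Assumed override: force the patch version the cluster runs -->
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-streaming-java_2.11</artifactId>
      <version>1.12.4</version>
      <scope>runtime</scope>
    </dependency>
  </dependencies>
</profile>
```

Running mvn dependency:tree -Pflink-runner should confirm which Flink version actually ends up in the bundled jar.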
