java.lang.VerifyError: "Stack map does not match the one at exception handler 70" when running Flink's SQL client with Iceberg and Hive

Asked by r1zhe5dt on 2021-07-13, in Hive

Following https://iceberg.apache.org/flink/, I started Flink's SQL client with the `-j` option: `bin/sql-client.sh embedded -j lib/flink-sql-connector-hive-2.3.6_2.11-1.11.3.jar -j lib/iceberg-flink-runtime-0.11.0.jar shell`, and ran into the following exception:

    Exception in thread "main" org.apache.flink.table.client.SqlClientException: Unexpected exception. This is a bug. Please consider filing an issue.
        at org.apache.flink.table.client.SqlClient.main(SqlClient.java:213)
    Caused by: org.apache.flink.table.client.gateway.SqlExecutionException: Could not create execution context.
        at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:870)
        at org.apache.flink.table.client.gateway.local.LocalExecutor.openSession(LocalExecutor.java:227)
        at org.apache.flink.table.client.SqlClient.start(SqlClient.java:108)
        at org.apache.flink.table.client.SqlClient.main(SqlClient.java:201)
    Caused by: java.lang.VerifyError: Stack map does not match the one at exception handler 70
    Exception Details:
      Location:
        org/apache/iceberg/hive/HiveCatalog.loadNamespaceMetadata(Lorg/apache/iceberg/catalog/Namespace;)Ljava/util/Map; @70: astore_2
      Reason:
        Type 'org/apache/hadoop/hive/metastore/api/NoSuchObjectException' (current frame, stack[0]) is not assignable to 'org/apache/thrift/TException' (stack map, stack[0])
      Current Frame:
        bci: @27
        flags: { }
        locals: { 'org/apache/iceberg/hive/HiveCatalog', 'org/apache/iceberg/catalog/Namespace' }
        stack: { 'org/apache/hadoop/hive/metastore/api/NoSuchObjectException' }
      Stackmap Frame:
        bci: @70
        flags: { }
        locals: { 'org/apache/iceberg/hive/HiveCatalog', 'org/apache/iceberg/catalog/Namespace' }
        stack: { 'org/apache/thrift/TException' }
      Bytecode:
        0x0000000: 2a2b b700 c59a 0016 bb01 2c59 1301 2e04
        0x0000010: bd01 3059 032b 53b7 0133 bf2a b400 3e2b
        0x0000020: ba02 8e00 00b6 00e8 c002 904d 2a2c b702
        0x0000030: 944e b201 2213 0296 2b2d b902 5d01 00b9
        0x0000040: 012a 0400 2db0 4dbb 012c 592c 1301 2e04
        0x0000050: bd01 3059 032b 53b7 0281 bf4d bb01 3559
        0x0000060: bb01 3759 b701 3813 0283 b601 3e2b b601
        0x0000070: 4113 0208 b601 3eb6 0144 2cb7 0147 bf4d
        0x0000080: b800 46b6 014a bb01 3559 bb01 3759 b701
        0x0000090: 3813 0285 b601 3e2b b601 4113 0208 b601
        0x00000a0: 3eb6 0144 2cb7 0147 bf
      Exception Handler Table:
        bci [27, 69] => handler: 70
        bci [27, 69] => handler: 70
        bci [27, 69] => handler: 91
        bci [27, 69] => handler: 127
      Stackmap Table:
        same_frame(@27)
        same_locals_1_stack_item_frame(@70,Object[#191])
        same_locals_1_stack_item_frame(@91,Object[#191])
        same_locals_1_stack_item_frame(@127,Object[#193])
        at org.apache.iceberg.flink.CatalogLoader$HiveCatalogLoader.loadCatalog(CatalogLoader.java:112)
        at org.apache.iceberg.flink.FlinkCatalog.<init>(FlinkCatalog.java:111)
        at org.apache.iceberg.flink.FlinkCatalogFactory.createCatalog(FlinkCatalogFactory.java:127)
        at org.apache.iceberg.flink.FlinkCatalogFactory.createCatalog(FlinkCatalogFactory.java:117)
        at org.apache.flink.table.client.gateway.local.ExecutionContext.createCatalog(ExecutionContext.java:378)
        at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$null$5(ExecutionContext.java:626)
        at java.util.HashMap.forEach(HashMap.java:1289)
        at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$initializeCatalogs$6(ExecutionContext.java:625)
        at org.apache.flink.table.client.gateway.local.ExecutionContext.wrapClassLoader(ExecutionContext.java:264)
        at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeCatalogs(ExecutionContext.java:624)
        at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeTableEnvironment(ExecutionContext.java:523)
        at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:183)
        at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:136)
        at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:859)
        ... 3 more
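To make sense of the "Exception Details" above: the verifier complains that `HiveCatalog.loadNamespaceMetadata` was compiled against a Hive where `NoSuchObjectException` is a subtype of `org.apache.thrift.TException`, but the classes loaded at runtime disagree. The sketch below is a hypothetical reconstruction of that try/catch shape (stand-in classes, not Iceberg's actual source); it compiles and runs only because the subtype relation holds, which is exactly what the runtime classpath breaks.

```java
import java.util.Map;

public class StackMapSketch {
    // Stand-ins for the real Hive/Thrift classes. At Iceberg's compile time,
    // NoSuchObjectException extends TException, as modeled here.
    static class TException extends Exception {}
    static class NoSuchObjectException extends TException {}

    // Simulated metastore call that fails with the narrower exception.
    static void metastoreCall(String ns) throws TException {
        throw new NoSuchObjectException();
    }

    static Map<String, String> loadNamespaceMetadata(String namespace) {
        try {
            metastoreCall(namespace);
            return Map.of();
        } catch (NoSuchObjectException e) {
            // The compiled stack-map frame at this handler records a type that
            // must be assignable to TException. If a different jar supplies a
            // NoSuchObjectException with another parent class, the bytecode
            // verifier rejects the whole method with the VerifyError above.
            return Map.of("error", "namespace does not exist: " + namespace);
        } catch (TException e) {
            return Map.of("error", "thrift failure");
        }
    }

    public static void main(String[] args) {
        System.out.println(loadNamespaceMetadata("db1"));
    }
}
```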

This looks like a jar version conflict.
I tried different combinations of Flink and Hive versions, but I still get the same error.
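One way to narrow down a conflict like this is to check which of the loaded jars actually bundles the two classes named in the VerifyError. The snippet below is a hypothetical diagnostic (the jar paths are taken from the command in the question; adjust them to your `lib/` directory), using only the standard `java.util.jar` API:

```java
import java.io.IOException;
import java.util.jar.JarFile;

public class JarScan {
    public static void main(String[] args) {
        // Jar paths from the sql-client.sh invocation in the question.
        String[] jars = {
            "lib/flink-sql-connector-hive-2.3.6_2.11-1.11.3.jar",
            "lib/iceberg-flink-runtime-0.11.0.jar"
        };
        // The two classes named in the VerifyError message.
        String[] classes = {
            "org/apache/thrift/TException.class",
            "org/apache/hadoop/hive/metastore/api/NoSuchObjectException.class"
        };
        for (String path : jars) {
            System.out.println("== " + path + " ==");
            try (JarFile jar = new JarFile(path)) {
                for (String cls : classes) {
                    // getEntry returns null when the class is not in this jar
                    System.out.println("  " + cls + ": "
                        + (jar.getEntry(cls) != null ? "bundled" : "absent"));
                }
            } catch (IOException e) {
                System.out.println("  (could not open jar: " + e.getMessage() + ")");
            }
        }
    }
}
```

If both jars report these classes as "bundled", the two fat jars ship different copies of the same Hive/Thrift types, and whichever copy wins on the classpath can break the other's compiled assumptions.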
