Flink: Could not initialize class org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex

Asked by jjjwad0x on 2023-09-28 in Apache

I am trying to use Hudi with a Flink pipeline to publish data in Parquet format to an S3 object store. While doing so I run into the following error:

java.lang.NoClassDefFoundError: Could not initialize class org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex
    at java.base/java.lang.Class.forName0(Native Method)
    at java.base/java.lang.Class.forName(Unknown Source)
    at org.apache.hudi.common.util.ReflectionUtils.getClass(ReflectionUtils.java:55)
    at org.apache.hudi.common.util.ReflectionUtils.loadClass(ReflectionUtils.java:79)
    at org.apache.hudi.common.bootstrap.index.BootstrapIndex.getBootstrapIndex(BootstrapIndex.java:163)
    at org.apache.hudi.common.table.view.AbstractTableFileSystemView.init(AbstractTableFileSystemView.java:118)
    at org.apache.hudi.common.table.view.HoodieTableFileSystemView.init(HoodieTableFileSystemView.java:113)
    at org.apache.hudi.common.table.view.HoodieTableFileSystemView.<init>(HoodieTableFileSystemView.java:107)
    at org.apache.hudi.common.table.view.FileSystemViewManager.createInMemoryFileSystemView(FileSystemViewManager.java:177)
    at org.apache.hudi.common.table.view.FileSystemViewManager.lambda$createViewManager$5fcdabfe$1(FileSystemViewManager.java:272)
    at org.apache.hudi.common.table.view.FileSystemViewManager.lambda$getFileSystemView$1(FileSystemViewManager.java:115)
    at java.base/java.util.concurrent.ConcurrentHashMap.computeIfAbsent(Unknown Source)
    at org.apache.hudi.common.table.view.FileSystemViewManager.getFileSystemView(FileSystemViewManager.java:114)
    at org.apache.hudi.table.HoodieTable.getSliceView(HoodieTable.java:320)
    at org.apache.hudi.sink.partitioner.profile.DeltaWriteProfile.getFileSystemView(DeltaWriteProfile.java:93)
    at org.apache.hudi.sink.partitioner.profile.WriteProfile.<init>(WriteProfile.java:120)
    at org.apache.hudi.sink.partitioner.profile.DeltaWriteProfile.<init>(DeltaWriteProfile.java:44)
    at org.apache.hudi.sink.partitioner.profile.WriteProfiles.getWriteProfile(WriteProfiles.java:75)
    at org.apache.hudi.sink.partitioner.profile.WriteProfiles.lambda$singleton$0(WriteProfiles.java:64)
    at java.base/java.util.HashMap.computeIfAbsent(Unknown Source)
    at org.apache.hudi.sink.partitioner.profile.WriteProfiles.singleton(WriteProfiles.java:63)
    at org.apache.hudi.sink.partitioner.BucketAssigners.create(BucketAssigners.java:56)
    at org.apache.hudi.sink.partitioner.BucketAssignFunction.open(BucketAssignFunction.java:122)
    at org.apache.flink.api.common.functions.util.FunctionUtils.openFunction(FunctionUtils.java:34)
    at org.apache.flink.streaming.api.operators.AbstractUdfStreamOperator.open(AbstractUdfStreamOperator.java:101)
    at org.apache.flink.streaming.api.operators.KeyedProcessOperator.open(KeyedProcessOperator.java:55)
    at org.apache.flink.streaming.runtime.tasks.RegularOperatorChain.initializeStateAndOpenOperators(RegularOperatorChain.java:107)
    at org.apache.flink.streaming.runtime.tasks.StreamTask.restoreGates(StreamTask.java:734)
    at org.apache.flink.streaming.runtime.tasks.StreamTaskActionExecutor$1.call(StreamTaskActionExecutor.java:55)
    at org.apache.flink.streaming.runtime.tasks.StreamTask.restoreInternal(StreamTask.java:709)
    at org.apache.flink.streaming.runtime.tasks.StreamTask.restore(StreamTask.java:675)
    at org.apache.flink.runtime.taskmanager.Task.runWithSystemExitMonitoring(Task.java:952)
    at org.apache.flink.runtime.taskmanager.Task.restoreAndInvoke(Task.java:921)
    at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:745)
    at org.apache.flink.runtime.taskmanager.Task.run(Task.java:562)
    at java.base/java.lang.Thread.run(Unknown Source)
    Suppressed: java.lang.NullPointerException
        at org.apache.hudi.sink.partitioner.BucketAssignFunction.close(BucketAssignFunction.java:247)
        at org.apache.flink.api.common.functions.util.FunctionUtils.closeFunction(FunctionUtils.java:41)
        at org.apache.flink.streaming.api.operators.AbstractUdfStreamOperator.close(AbstractUdfStreamOperator.java:115)
        at org.apache.flink.streaming.runtime.tasks.StreamOperatorWrapper.close(StreamOperatorWrapper.java:163)
        at org.apache.flink.streaming.runtime.tasks.RegularOperatorChain.closeAllOperators(RegularOperatorChain.java:125)
        at org.apache.flink.streaming.runtime.tasks.StreamTask.closeAllOperators(StreamTask.java:1043)
        at org.apache.flink.util.IOUtils.closeAll(IOUtils.java:255)
        at org.apache.flink.core.fs.AutoCloseableRegistry.doClose(AutoCloseableRegistry.java:72)
        at org.apache.flink.util.AbstractAutoCloseableRegistry.close(AbstractAutoCloseableRegistry.java:127)
        at org.apache.flink.streaming.runtime.tasks.StreamTask.cleanUp(StreamTask.java:951)
        at org.apache.flink.runtime.taskmanager.Task.lambda$restoreAndInvoke$0(Task.java:934)
        at org.apache.flink.runtime.taskmanager.Task.runWithSystemExitMonitoring(Task.java:952)
        at org.apache.flink.runtime.taskmanager.Task.restoreAndInvoke(Task.java:934)

Here is the hoodie.properties that was created:

hoodie.compaction.payload.class=org.apache.hudi.common.model.EventTimeAvroPayload
hoodie.table.type=MERGE_ON_READ
hoodie.table.partition.fields=eventId,eventName
hoodie.table.cdc.enabled=false
hoodie.archivelog.folder=archived
hoodie.timeline.layout.version=1
hoodie.table.checksum=3942898242
hoodie.datasource.write.drop.partition.columns=false
hoodie.table.recordkey.fields=event_id,event_timestamp
hoodie.table.name=sink_table
hoodie.compaction.record.merger.strategy=eeb8d96f-b1e4-49fd-bbf8-28ac514178e5
hoodie.datasource.write.hive_style_partitioning=false
hoodie.table.keygenerator.class=org.apache.hudi.keygen.ComplexAvroKeyGenerator
hoodie.datasource.write.partitionpath.urlencode=false
hoodie.table.version=5

I tried putting hudi-flink1.16-bundle-0.13.1 on the classpath, but that just produced a few NoClassDefFound errors, so I added the required libraries individually instead. Can anyone help me figure out what I am missing? TIA
Flink version: 1.17.1, Hudi version: 0.13.1
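
For context, here is a stripped-down sketch of the kind of Flink Table API sink definition that would create a table matching the hoodie.properties above; the S3 path, bucket, and extra columns are placeholders, not my exact job:

import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;

public class HudiS3SinkSketch {
    public static void main(String[] args) {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tEnv = StreamTableEnvironment.create(env);

        // Hypothetical Hudi sink matching the hoodie.properties shown above:
        // MERGE_ON_READ, record key (event_id, event_timestamp),
        // partitioned by (eventId, eventName).
        tEnv.executeSql(
            "CREATE TABLE sink_table (" +
            "  event_id STRING," +
            "  event_timestamp BIGINT," +
            "  eventId STRING," +
            "  eventName STRING" +
            ") PARTITIONED BY (eventId, eventName) WITH (" +
            "  'connector' = 'hudi'," +
            "  'path' = 's3a://<bucket>/<prefix>/sink_table'," +
            "  'table.type' = 'MERGE_ON_READ'," +
            "  'hoodie.datasource.write.recordkey.field' = 'event_id,event_timestamp'" +
            ")");

        // An INSERT INTO sink_table SELECT ... from the source table is what
        // ends up in the BucketAssignFunction shown in the stack trace.
    }
}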

zd287kbt

I solved this by adding the hudi-flink bundle as a plugin, i.e. under flink_dir/plugins/hudi/<hudi-flink-jar>, and removing it from the lib directory.
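
If it helps, a quick sanity check (just a sketch, class name taken from the error above) to confirm the class can now be loaded and initialized, and to see which jar it is served from:

// Run with the same classpath as the Flink task managers.
public class HudiClasspathCheck {
    public static void main(String[] args) throws Exception {
        // Class.forName triggers static initialization, so this reproduces the
        // "Could not initialize class" failure if the bundle is still broken.
        Class<?> c = Class.forName(
                "org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex");
        System.out.println("Loaded " + c.getName() + " from "
                + c.getProtectionDomain().getCodeSource().getLocation());
    }
}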
