Checkpointing in Flink

Asked by htzpubme on 2021-06-24 in Flink

I am trying to use Flink's checkpointing mechanism with HDFS as the filesystem. When connecting to hdfs://aleksandar/0.0.0.0:50010/shared/ I get the following error:

Caused by: java.lang.IllegalArgumentException: Pathname /0.0.0.0:50010/shared/972dde22148f58ec9f266fb7bdfae891 from hdfs://aleksandar/0.0.0.0:50010/shared/972dde22148f58ec9f266fb7bdfae891 is not a valid DFS filename.
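For what it's worth, the exception can be reproduced in spirit with a plain `java.net.URI` parse of the string from the stack trace: the host part of the URI is `aleksandar`, so the DataNode address `0.0.0.0:50010` lands inside the *path* component, and HDFS rejects path components containing a colon. A minimal stdlib sketch (no Hadoop classes involved):

```java
import java.net.URI;

public class CheckpointPathCheck {
    public static void main(String[] args) {
        // The exact URI from the error message above.
        URI uri = URI.create(
            "hdfs://aleksandar/0.0.0.0:50010/shared/972dde22148f58ec9f266fb7bdfae891");

        // "aleksandar" is parsed as the authority (the NameNode host)...
        System.out.println("authority = " + uri.getAuthority());

        // ...so "/0.0.0.0:50010/shared/..." becomes the filesystem path.
        // HDFS path components may not contain ':', hence
        // "is not a valid DFS filename".
        System.out.println("path      = " + uri.getPath());
    }
}
```

This suggests the DataNode address was concatenated into the checkpoint directory rather than used as the filesystem authority.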

In my core-site.xml I have the following configuration:

<configuration>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/var/lib/hadoop</value>
  </property>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://0.0.0.0:123</value>
  </property>
  <property>
     <name>dfs.webhdfs.enabled</name>
     <value>true</value>
  </property>
</configuration>
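A checkpoint URI normally names only the NameNode RPC endpoint (matching `fs.defaultFS`) as the authority, with a plain directory path after it. A hedged sketch of what the Flink configuration might look like, where `namenode-host` is a placeholder for the actual resolvable NameNode hostname (not an assertion about this cluster's setup):

```yaml
# flink-conf.yaml sketch; "namenode-host" and the port are placeholders
# that should match the host/port in fs.defaultFS.
state.backend: filesystem
state.checkpoints.dir: hdfs://namenode-host:123/shared/checkpoints
```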
