ClassCastException at Flink runtime: cannot assign instance of java.util.LinkedHashMap to field org.apache.flink.runtime.jobgraph.JobVertex.results

kfgdxczn  posted on 2023-05-27 in Apache

The following error occurs when running a Flink job:

  ClassCastException: cannot assign instance of java.util.LinkedHashMap to field org.apache.flink.runtime.jobgraph.JobVertex.results of type java.util.ArrayList in instance of org.apache.flink.runtime.jobgraph.InputOutputFormatVertex

Here is the source code:

  import org.apache.flink.api.common.functions.FilterFunction;
  import org.apache.flink.api.common.functions.MapFunction;
  import org.apache.flink.api.java.DataSet;
  import org.apache.flink.api.java.ExecutionEnvironment;
  import org.apache.flink.api.java.tuple.Tuple2;
  import org.apache.flink.api.java.utils.ParameterTool;

  public class WordCount {

      public static void main(String[] args) throws Exception {
          ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
          ParameterTool params = ParameterTool.fromArgs(args);
          env.getConfig().setGlobalJobParameters(params);

          DataSet<String> text = env.readTextFile(params.get("input"));

          // Keep only the lines that start with "N"
          DataSet<String> filtered = text.filter(new FilterFunction<String>() {
              @Override
              public boolean filter(String value) {
                  return value.startsWith("N");
              }
          });

          DataSet<Tuple2<String, Integer>> tokenized = filtered.map(new Tokenizer());
          DataSet<Tuple2<String, Integer>> counts = tokenized.groupBy(0).sum(1);

          if (params.has("output")) {
              counts.writeAsText(params.get("output"));
              env.execute("WordCount Example");
          }
      }

      public static final class Tokenizer implements MapFunction<String, Tuple2<String, Integer>> {
          @Override
          public Tuple2<String, Integer> map(String value) {
              return new Tuple2<>(value, 1);
          }
      }
  }

Error: [screenshot of the stack trace]


6jygbczu1#

You were right, David Anderson. It was a version mismatch on my local machine; I resolved it by upgrading my local Flink cluster to the latest version.
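For anyone hitting the same error: a quick way to confirm this kind of mismatch is to print the Flink version the job jar was built against and compare it with the version shown in the cluster's web UI. The sketch below assumes flink-runtime is on the classpath (it usually is, transitively via flink-clients); the class name FlinkVersionCheck is just for illustration.

  import org.apache.flink.runtime.util.EnvironmentInformation;

  public class FlinkVersionCheck {
      public static void main(String[] args) {
          // Version of the Flink libraries visible to the job jar. If this does not
          // match the version the cluster reports, the JobGraph serialized by the
          // client can fail to deserialize on the JobManager with errors like the one above.
          System.out.println("Client-side Flink version: " + EnvironmentInformation.getVersion());
      }
  }

In Maven/Gradle builds, the usual convention is to keep every org.apache.flink dependency on the exact version the cluster runs (and mark the core ones as provided), so the client that serializes the JobGraph and the JobManager that deserializes it agree on the class layout.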
