hadoop

mm5n2pyu · published 2021-06-02 in Hadoop

I have a GenericUDF (code below) that runs fine on Hadoop 1 with Hive 0.12. But when I test the same GenericUDF on Hadoop 2 with Hive 0.13, I get the following error:
Vertex failed, vertexName=Map 12, vertexId=vertex_1409698731658_42202_1_00, diagnostics=[Vertex Input: ccv initializer failed, org.apache.hive.com.esotericsoftware.kryo.KryoException: Unable to find class: com...Id1
Here is my UDF code:

    package com.xxx.xxx;

    import org.apache.hadoop.hive.ql.exec.MapredContext;
    import org.apache.hadoop.hive.ql.exec.UDFArgumentException;
    import org.apache.hadoop.hive.ql.metadata.HiveException;
    import org.apache.hadoop.hive.ql.udf.generic.GenericUDF;
    import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
    import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory;

    public class Id1 extends GenericUDF {

        private MapredContext context;
        private long sequenceNum = 0;
        private static final int padLength = 10;
        StringBuilder sb = null;

        @Override
        public ObjectInspector initialize(ObjectInspector[] arguments)
                throws UDFArgumentException {
            sequenceNum = 0;
            sb = new StringBuilder();
            return PrimitiveObjectInspectorFactory.javaStringObjectInspector;
        }

        @Override
        public Object evaluate(DeferredObject[] arguments) throws HiveException {
            // Clear the buffer left over from the previous row.
            int sbLength = sb.length();
            if (sbLength > 0)
                sb.replace(0, sbLength, "");
            String taskId = null;
            if (context.getJobConf() != null)
                taskId = context.getJobConf().get("mapred.taskid");
            sequenceNum++;
            // Prefix the per-row sequence number with the task attempt id
            // so ids are unique across tasks.
            if (taskId != null) {
                sb.append(taskId.replace("attempt_", ""));
            }
            sb.append(String.valueOf(sequenceNum));
            return sb.toString();
        }

        @Override
        public String getDisplayString(String[] children) {
            return "id1()";
        }

        @Override
        public void configure(MapredContext context) {
            this.context = context;
        }
    }

I am fairly sure this is related to Hive 0.13, but I cannot find any posts about this error.
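One hedged observation, not confirmed by the post: a Kryo "Unable to find class" during vertex initialization usually means the container deserializing the query plan cannot see the UDF's jar on its classpath, and starting with Hive 0.13 the plan, including initialized UDF instances, is serialized with Kryo. A related Kryo pitfall is holding runtime-only handles such as MapredContext in non-transient instance fields. The sketch below illustrates that second principle with plain java.io serialization instead of Kryo; TransientDemo, RuntimeHandle, and roundTrip are hypothetical names, and RuntimeHandle stands in for MapredContext.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class TransientDemo {

    // Hypothetical stand-in for a runtime-only handle such as MapredContext;
    // note that it is NOT Serializable.
    static class RuntimeHandle { }

    // Hypothetical UDF-like class: the transient keyword tells Java
    // serialization to skip the field, so the non-serializable handle
    // never reaches the wire.
    static class Udf implements Serializable {
        transient RuntimeHandle context = new RuntimeHandle();
        long sequenceNum = 42;
    }

    // Serialize and deserialize with plain java.io to mimic a plan round-trip.
    static Udf roundTrip(Udf original) throws Exception {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(original);
        }
        try (ObjectInputStream ois = new ObjectInputStream(
                new ByteArrayInputStream(bos.toByteArray()))) {
            return (Udf) ois.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        Udf copy = roundTrip(new Udf());
        // The transient field comes back null; the plain field survives.
        System.out.println(copy.context == null);   // true
        System.out.println(copy.sequenceNum);       // 42
    }
}
```

Without the transient keyword, writeObject would throw NotSerializableException for RuntimeHandle, which is the java.io analogue of a Kryo plan-serialization failure.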

No answers yet.
