How do I fix the Hive & Hadoop error "java.lang.RuntimeException: Error caching map.xml"?

gopyfrb3 · asked 2021-05-29 · in Hadoop

I am trying to insert some values into a Hive table (apache-hive-3.1.1), but I get an "Error caching map.xml" error. SHOW TABLES runs fine, and I can see the Hadoop cluster at localhost:9870. Can anyone tell me how to fix this error?
I have already tried formatting the NameNode and running the query again. The other options I found were changing permissions and deleting the HDFS data, but I could not figure out how to do either.
CREATE TABLE works perfectly fine:

    create table employees
    (empid int,
    firstname varchar(30),
    lastname varchar(30),
    tenure int,
    address struct<street:string,city:string>,
    subordinates array<string>);

The following does not work:

    insert into employees
    select 1, "Vitthal","Srinivasan",1, named_struct("street","Bellandur","city","Bangalore"),array("Anuradha","Arun","Swetha")
    union all select 2, "Swetha","Kolalapudi",4, named_struct("street","Bellandur","city","Bangalore"),array("Pradeep")
    union all select 3, "Janani","Ravi",2, named_struct("street","Bellandur","city","Bangalore"),array("Navdeep")
    union all select 4, "Navdeep","Singh",3, named_struct("street","Bellandur","city","Bangalore"),array("Shreya","Jitu");

Below is the error message I receive:

    hive> insert into employees
        > select 1, "Vitthal","Srinivasan",1, named_struct("street","Bellandur","city","Bangalore"),array("Anuradha","Arun","Swetha")
        > union all select 2, "Swetha","Kolalapudi",4, named_struct("street","Bellandur","city","Bangalore"),array("Pradeep")
        > union all select 3, "Janani","Ravi",2, named_struct("street","Bellandur","city","Bangalore"),array("Navdeep")
        > union all select 4, "Navdeep","Singh",3, named_struct("street","Bellandur","city","Bangalore"),array("Shreya","Jitu");
    Query ID = saurabhsomani_20190514204737_a73ab528-757c-4a7a-953a-4c5f1ce5ebcf
    Total jobs = 3
    Launching Job 1 out of 3
    Number of reduce tasks is set to 0 since there's no reduce operator
    java.lang.RuntimeException: Error caching map.xml
            at org.apache.hadoop.hive.ql.exec.Utilities.setBaseWork(Utilities.java:641)
            at org.apache.hadoop.hive.ql.exec.Utilities.setMapWork(Utilities.java:566)
            at org.apache.hadoop.hive.ql.exec.Utilities.setMapRedWork(Utilities.java:558)
            at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:362)
            at org.apache.hadoop.hive.ql.exec.mr.MapRedTask.execute(MapRedTask.java:149)
            at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:205)
            at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:97)
            at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2664)
            at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:2335)
            at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:2011)
            at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1709)
            at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1703)
            at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:157)
            at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:218)
            at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:239)
            at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:188)
            at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:402)
            at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:821)
            at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759)
            at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:683)
            at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
            at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
            at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
            at java.lang.reflect.Method.invoke(Method.java:498)
            at org.apache.hadoop.util.RunJar.run(RunJar.java:318)
            at org.apache.hadoop.util.RunJar.main(RunJar.java:232)
    Caused by: org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /tmp/hive/saurabhsomani/255fb86f-47de-4f31-ad0d-a64b8d864d37/hive_2019-05-14_20-47-37_654_6424247597470144451-1/-mr-10004/2fe7dee9-efc9-43a9-80c7-84f922243f7d/map.xml could only be written to 0 of the 1 minReplication nodes. There are 0 datanode(s) running and no node(s) are excluded in this operation.
            at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:2117)
            at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.chooseTargetForNewBlock(FSDirWriteFileOp.java:287)
            at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2691)
            at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:875)
            at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:561)
            at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
            at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:523)
            at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:991)
            at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:872)
            at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:818)
            at java.security.AccessController.doPrivileged(Native Method)
            at javax.security.auth.Subject.doAs(Subject.java:422)
            at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1729)
            at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2678)
            at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1497)
            at org.apache.hadoop.ipc.Client.call(Client.java:1443)
            at org.apache.hadoop.ipc.Client.call(Client.java:1353)
            at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:228)
            at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116)
            at com.sun.proxy.$Proxy29.addBlock(Unknown Source)
            at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:510)
            at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
            at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
            at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
            at java.lang.reflect.Method.invoke(Method.java:498)
            at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422)
            at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165)
            at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157)
            at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
            at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359)
            at com.sun.proxy.$Proxy30.addBlock(Unknown Source)
            at org.apache.hadoop.hdfs.DFSOutputStream.addBlock(DFSOutputStream.java:1078)
            at org.apache.hadoop.hdfs.DataStreamer.locateFollowingBlock(DataStreamer.java:1865)
            at org.apache.hadoop.hdfs.DataStreamer.nextBlockOutputStream(DataStreamer.java:1668)
            at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:716)
    Job Submission failed with exception 'java.lang.RuntimeException(Error caching map.xml)'
    FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask. Error caching map.xml
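Note that the "Caused by" line points at HDFS rather than Hive: "could only be written to 0 of the 1 minReplication nodes. There are 0 datanode(s) running". That means no live DataNode is registered with the NameNode, so Hive cannot stage map.xml. A diagnostic sketch, assuming a standard single-node Hadoop install with its binaries on the PATH (the /tmp/hadoop-$USER storage paths below are illustrative defaults; check dfs.namenode.name.dir and dfs.datanode.data.dir in hdfs-site.xml for the real ones):

```shell
# List the running Hadoop daemons; a healthy single-node setup
# should show NameNode, DataNode, and SecondaryNameNode.
jps

# Ask HDFS directly how many DataNodes are live.
hdfs dfsadmin -report

# If no DataNode is running, try starting the HDFS daemons.
start-dfs.sh

# A common reason the DataNode refuses to start after re-running
# "hdfs namenode -format" is a clusterID mismatch between the
# NameNode and DataNode storage directories. Compare the two
# VERSION files (paths are illustrative):
cat /tmp/hadoop-$USER/dfs/name/current/VERSION
cat /tmp/hadoop-$USER/dfs/data/current/VERSION

# If the clusterIDs differ, one option on a throwaway test cluster
# is to wipe the DataNode directory and restart HDFS. This deletes
# all HDFS data, so only do it when the data is disposable:
# rm -rf /tmp/hadoop-$USER/dfs/data
# start-dfs.sh
```

If `hdfs dfsadmin -report` shows at least one live DataNode afterwards, the INSERT should get past the map.xml staging step.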

No answers yet.
