Cannot connect to HDFS from my local machine

x3naxklr posted on 2021-05-29 in Hadoop

I am writing a simple program to read/write data on HDFS. I am unable to connect from my local machine to an HDFS instance installed on a remote machine. I get the following exception:

  18/08/19 16:47:45 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, always=false, about=, type=DEFAULT, value=[Rate of successful kerberos logins and latency (milliseconds)], valueName=Time)
  18/08/19 16:47:45 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, always=false, about=, type=DEFAULT, value=[Rate of failed kerberos logins and latency (milliseconds)], valueName=Time)
  18/08/19 16:47:45 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, always=false, about=, type=DEFAULT, value=[GetGroups], valueName=Time)
  18/08/19 16:47:45 DEBUG impl.MetricsSystemImpl: UgiMetrics, User and group related metrics
  18/08/19 16:47:45 DEBUG security.Groups: Creating new Groups object
  18/08/19 16:47:45 DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...
  18/08/19 16:47:45 DEBUG util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
  18/08/19 16:47:45 DEBUG util.NativeCodeLoader: java.library.path=/Users/rabbit/Library/Java/Extensions:/Library/Java/Extensions:/Network/Library/Java/Extensions:/System/Library/Java/Extensions:/usr/lib/java:.
  18/08/19 16:47:45 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
  18/08/19 16:47:45 DEBUG security.JniBasedUnixGroupsMappingWithFallback: Falling back to shell based
  18/08/19 16:47:45 DEBUG security.JniBasedUnixGroupsMappingWithFallback: Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
  18/08/19 16:47:45 DEBUG util.Shell: Failed to detect a valid hadoop home directory
  java.io.IOException: HADOOP_HOME or hadoop.home.dir are not set.
      at org.apache.hadoop.util.Shell.checkHadoopHome(Shell.java:302)
      at org.apache.hadoop.util.Shell.<clinit>(Shell.java:327)
      at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:78)
      at org.apache.hadoop.security.Groups.parseStaticMapping(Groups.java:93)
      at org.apache.hadoop.security.Groups.<init>(Groups.java:77)
      at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:240)
      at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:257)
      at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:234)
      at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:749)
      at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:734)
      at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:607)
      at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2748)
      at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2740)
      at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2606)
      at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:368)
      at com.rabbit.hdfs.io.ReadFileDataToConsole.main(ReadFileDataToConsole.java:22)
  18/08/19 16:47:45 DEBUG util.Shell: setsid is not available on this machine. So not using it.
  18/08/19 16:47:45 DEBUG util.Shell: setsid exited with exit code 0
  18/08/19 16:47:45 DEBUG security.Groups: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000
  18/08/19 16:47:45 DEBUG security.UserGroupInformation: hadoop login
  18/08/19 16:47:45 DEBUG security.UserGroupInformation: hadoop login commit
  18/08/19 16:47:45 DEBUG security.UserGroupInformation: using local user:UnixPrincipal: rabbit
  18/08/19 16:47:45 DEBUG security.UserGroupInformation: UGI loginUser:rabbit (auth:SIMPLE)
  18/08/19 16:47:46 DEBUG hdfs.BlockReaderLocal: dfs.client.use.legacy.blockreader.local = false
  18/08/19 16:47:46 DEBUG hdfs.BlockReaderLocal: dfs.client.read.shortcircuit = false
  18/08/19 16:47:46 DEBUG hdfs.BlockReaderLocal: dfs.client.domain.socket.data.traffic = false
  18/08/19 16:47:46 DEBUG hdfs.BlockReaderLocal: dfs.domain.socket.path =
  18/08/19 16:47:46 DEBUG retry.RetryUtils: multipleLinearRandomRetry = null
  18/08/19 16:47:46 DEBUG ipc.Server: rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine$RpcRequestWrapper, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@12405818
  18/08/19 16:47:46 DEBUG ipc.Client: getting client out of cache: org.apache.hadoop.ipc.Client@7ff2a664
  18/08/19 16:47:46 DEBUG shortcircuit.DomainSocketFactory: Both short-circuit local reads and UNIX domain socket are disabled.
  18/08/19 16:47:46 DEBUG ipc.Client: The ping interval is 60000 ms.
  18/08/19 16:47:46 DEBUG ipc.Client: Connecting to /192.168.143.150:54310
  18/08/19 16:47:46 DEBUG ipc.Client: closing ipc connection to 192.168.143.150/192.168.143.150:54310: Connection refused
  java.net.ConnectException: Connection refused
      at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
      at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
      at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
      at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:529)
      at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:493)
      at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:606)
      at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:700)
      at org.apache.hadoop.ipc.Client$Connection.access$2800(Client.java:367)
      at org.apache.hadoop.ipc.Client.getConnection(Client.java:1463)
      at org.apache.hadoop.ipc.Client.call(Client.java:1382)
      at org.apache.hadoop.ipc.Client.call(Client.java:1364)
      at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
      at com.sun.proxy.$Proxy9.getBlockLocations(Unknown Source)
      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      at java.lang.reflect.Method.invoke(Method.java:498)
      at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
      at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
      at com.sun.proxy.$Proxy9.getBlockLocations(Unknown Source)
      at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getBlockLocations(ClientNamenodeProtocolTranslatorPB.java:225)
      at org.apache.hadoop.hdfs.DFSClient.callGetBlockLocations(DFSClient.java:1165)
      at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:1155)
      at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:1145)
      at org.apache.hadoop.hdfs.DFSInputStream.fetchLocatedBlocksAndGetLastBlockLength(DFSInputStream.java:268)
      at org.apache.hadoop.hdfs.DFSInputStream.openInfo(DFSInputStream.java:235)
      at org.apache.hadoop.hdfs.DFSInputStream.<init>(DFSInputStream.java:228)
      at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1318)
      at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:293)
      at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:289)
      at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
      at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:289)
      at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:764)
      at com.rabbit.hdfs.io.ReadFileDataToConsole.main(ReadFileDataToConsole.java:29)
  18/08/19 16:47:46 DEBUG ipc.Client: IPC Client (775931202) connection to /192.168.143.150:54310 from rabbit: closed
  18/08/19 16:47:46 DEBUG ipc.Client: stopping client from cache: org.apache.hadoop.ipc.Client@7ff2a664
  18/08/19 16:47:46 DEBUG ipc.Client: removing client from cache: org.apache.hadoop.ipc.Client@7ff2a664
  18/08/19 16:47:46 DEBUG ipc.Client: stopping actual client because no more references remain: org.apache.hadoop.ipc.Client@7ff2a664
  18/08/19 16:47:46 DEBUG ipc.Client: Stopping client
  Exception in thread "main" java.net.ConnectException: Call From rabbit/127.0.0.1 to 192.168.143.150:54310 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
      at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
      at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
      at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
      at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
      at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:783)
      at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:730)
      at org.apache.hadoop.ipc.Client.call(Client.java:1415)
      at org.apache.hadoop.ipc.Client.call(Client.java:1364)
      at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
      at com.sun.proxy.$Proxy9.getBlockLocations(Unknown Source)
      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      at java.lang.reflect.Method.invoke(Method.java:498)
      at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
      at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
      at com.sun.proxy.$Proxy9.getBlockLocations(Unknown Source)
      at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getBlockLocations(ClientNamenodeProtocolTranslatorPB.java:225)
      at org.apache.hadoop.hdfs.DFSClient.callGetBlockLocations(DFSClient.java:1165)
      at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:1155)
      at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:1145)
      at org.apache.hadoop.hdfs.DFSInputStream.fetchLocatedBlocksAndGetLastBlockLength(DFSInputStream.java:268)
      at org.apache.hadoop.hdfs.DFSInputStream.openInfo(DFSInputStream.java:235)
      at org.apache.hadoop.hdfs.DFSInputStream.<init>(DFSInputStream.java:228)
      at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1318)
      at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:293)
      at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:289)
      at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
      at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:289)
      at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:764)
      at com.rabbit.hdfs.io.ReadFileDataToConsole.main(ReadFileDataToConsole.java:29)
  Caused by: java.net.ConnectException: Connection refused
      at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
      at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
      at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
      at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:529)
      at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:493)
      at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:606)
      at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:700)
      at org.apache.hadoop.ipc.Client$Connection.access$2800(Client.java:367)
      at org.apache.hadoop.ipc.Client.getConnection(Client.java:1463)
      at org.apache.hadoop.ipc.Client.call(Client.java:1382)
      ... 24 more

I used this link as my reference. I have been racking my brains trying to debug this further, but I can't figure out where I went wrong. Could someone help me sort this out?
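(The trace shows the client dies before any HDFS I/O, while trying to open the IPC connection to 192.168.143.150:54310. Before digging into the Hadoop client itself, it can help to confirm that the NameNode port is reachable at all from the local machine. A minimal probe using only `java.net` — the host and port below are taken from the trace; `NameNodeProbe` is just an illustrative name:)

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class NameNodeProbe {
    /** Returns true if a TCP connection to host:port succeeds within timeoutMs. */
    static boolean isReachable(String host, int port, int timeoutMs) {
        try (Socket socket = new Socket()) {
            socket.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (IOException e) {
            // Connection refused, timed out, or host unreachable.
            return false;
        }
    }

    public static void main(String[] args) {
        // Host and port from the stack trace. If this prints false, the
        // problem is networking/NameNode config, not the client code.
        System.out.println(isReachable("192.168.143.150", 54310, 3000));
    }
}
```

(If the probe fails, the usual suspects are the NameNode not running, a firewall, or the NameNode bound to 127.0.0.1 instead of the machine's external address.)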


tez616oj1#

Add the IP address 192.168.143.150 to /etc/hosts, like:

  192.168.143.150 192.168.143.150
  127.0.0.1 localhost

This helped me fix the issue. Thanks :)
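(For anyone hitting the same trace: the client was already dialing `192.168.143.150:54310`, so the filesystem URI on the client side must point at that address too. A sketch of the relevant `core-site.xml` fragment, assuming a Hadoop 2.x client — `fs.defaultFS` is the 2.x property name; 1.x releases used `fs.default.name`:)

```xml
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <!-- NameNode host:port from the stack trace -->
    <value>hdfs://192.168.143.150:54310</value>
  </property>
</configuration>
```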
