I am learning how to set up a 4-node Hadoop environment on the AWS free tier (Ubuntu 20.04.2). When I run start-dfs.sh on my NameNode, for some reason the DataNodes do not start.
Logs:
2021-10-24 12:46:46,876 INFO org.apache.hadoop.hdfs.server.namenode.top.metrics.TopMetrics: NNTop conf: dfs.namenode.top.window.num.buckets = 10
2021-10-24 12:46:46,888 INFO org.apache.hadoop.hdfs.server.namenode.top.metrics.TopMetrics: NNTop conf: dfs.namenode.top.num.users = 10
2021-10-24 12:46:46,888 INFO org.apache.hadoop.hdfs.server.namenode.top.metrics.TopMetrics: NNTop conf: dfs.namenode.top.windows.minutes = 1,5,25
2021-10-24 12:46:46,929 INFO org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode: Checkpoint Period :3600 secs (60 min)
2021-10-24 12:46:46,929 INFO org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode: Log Size Trigger :1000000 txns
2021-10-24 12:46:46,951 INFO org.apache.hadoop.hdfs.DFSUtil: Starting Web-server for secondary at: http://0.0.0.0:9868
2021-10-24 12:46:47,018 INFO org.eclipse.jetty.util.log: Logging initialized @6209ms to org.eclipse.jetty.util.log.Slf4jLog
2021-10-24 12:46:47,419 INFO org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets.
2021-10-24 12:46:47,448 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.secondary is not defined
2021-10-24 12:46:47,469 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
2021-10-24 12:46:47,470 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context secondary
2021-10-24 12:46:47,470 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context logs
2021-10-24 12:46:47,470 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context static
2021-10-24 12:46:47,604 INFO org.apache.hadoop.http.HttpServer2: Jetty bound to port 9868
2021-10-24 12:46:47,605 INFO org.eclipse.jetty.server.Server: jetty-9.4.20.v20190813; built: 2019-08-13T21:28:18.144Z; git: 84700530e645e812b336747464d6fbbf370c9a20; jvm 11.0.11+9-Ubuntu-0ubuntu2.20.04
2021-10-24 12:46:47,725 INFO org.eclipse.jetty.server.session: DefaultSessionIdManager workerName=node0
2021-10-24 12:46:47,725 INFO org.eclipse.jetty.server.session: No SessionScavenger set, using defaults
2021-10-24 12:46:47,726 INFO org.eclipse.jetty.server.session: node0 Scavenging every 600000ms
2021-10-24 12:46:47,780 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@6d868997{logs,/logs,file:///usr/local/hadoop/logs/,AVAILABLE}
2021-10-24 12:46:47,784 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@3af37506{static,/static,file:///usr/local/hadoop/share/hadoop/hdfs/webapps/static/,AVAILABLE}
2021-10-24 12:46:48,186 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.w.WebAppContext@783efb48{secondary,/,file:///usr/local/hadoop/share/hadoop/hdfs/webapps/secondary/,AVAILABLE}{file:/usr/local/hadoop/share/hadoop/hdfs/webapps/secondary}
2021-10-24 12:46:48,233 INFO org.eclipse.jetty.server.AbstractConnector: Started ServerConnector@7f9e1534{HTTP/1.1,[http/1.1]}{0.0.0.0:9868}
2021-10-24 12:46:48,233 INFO org.eclipse.jetty.server.Server: Started @7424ms
2021-10-24 12:46:48,233 INFO org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode: Web server init done
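The log above only shows the SecondaryNameNode web server coming up, so to see which daemons are actually running I check with something like the following (the worker hostname and SSH user are placeholders for my actual instances):

    # on the NameNode: list the running Hadoop JVMs
    jps
    # ask the NameNode how many DataNodes have registered (run once HDFS is up)
    hdfs dfsadmin -report
    # on each worker (hostname/user are placeholders): check the DataNode log for errors
    ssh ubuntu@hadoop-worker1 'jps; tail -n 50 /usr/local/hadoop/logs/hadoop-*-datanode-*.log'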
I have already tried deleting the temporary files and reformatting my NameNode, but the problem persists.
PS: I am using Hadoop 3.2.2.
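For reference, the clean-and-reformat procedure I followed looks roughly like this; the data directory path is an assumption and should match whatever dfs.namenode.name.dir / dfs.datanode.data.dir point to in hdfs-site.xml, and the deletion has to be repeated on every node:

    # on the NameNode: stop HDFS first
    stop-dfs.sh
    # remove the old metadata/data directories on EVERY node
    # (path below is an assumption; use the values from hdfs-site.xml)
    rm -rf /usr/local/hadoop/hadoop_data/hdfs/*
    rm -rf /tmp/hadoop-*
    # reformat the NameNode, then start HDFS again
    hdfs namenode -format
    start-dfs.sh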
1 Answer
Fixed this issue by downgrading to Hadoop 2.10.1.
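In case it helps, a rough sketch of the downgrade I mean (the download URL and install prefix are assumptions; the existing configuration files from the 3.2.2 install still need to be copied over, and HADOOP_HOME/PATH updated in ~/.bashrc on every node):

    # download and unpack Hadoop 2.10.1 (standard Apache archive location, assumed)
    wget https://archive.apache.org/dist/hadoop/common/hadoop-2.10.1/hadoop-2.10.1.tar.gz
    tar -xzf hadoop-2.10.1.tar.gz
    sudo mv hadoop-2.10.1 /usr/local/hadoop-2.10.1
    # point HADOOP_HOME at the new version (make this permanent in ~/.bashrc)
    export HADOOP_HOME=/usr/local/hadoop-2.10.1
    export PATH=$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$PATH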