Apache Spark

omtl5h9j · posted 2021-05-29 in Hadoop

I have just started learning Apache Spark. The first thing I did was try to install Spark on my machine. I downloaded the build pre-built for Hadoop 2.6. When I run spark-shell I get the following error:

  java.lang.RuntimeException: java.lang.NullPointerException
  at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
  at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:171)
  at org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:163)
  at org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:161)
  at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:168)
  at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
  at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
  at java.lang.reflect.Constructor.newInstance(Unknown Source)
  at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028)
  at $iwC$$iwC.<init>(<console>:9)
  at $iwC.<init>(<console>:18)
  at <init>(<console>:20)
  at .<init>(<console>:24)
  at .<clinit>(<console>)
  at .<init>(<console>:7)
  at .<clinit>(<console>)
  at $print(<console>)
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
  at java.lang.reflect.Method.invoke(Unknown Source)
  at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
  at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)
  at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
  at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
  at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
  at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
  at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
  at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
  at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:132)
  at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
  at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
  at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
  at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
  at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)

I searched for this error and found that I need to download winutils.exe. I set HADOOP_HOME = "c:\Hadoop" and then ran the following command:

  C:\Hadoop\bin\winutils.exe chmod 777 /tmp/hive

but I got the following error:

  This version of C:\Hadoop\bin\winutils.exe is not compatible with the version of
  Windows you're running. Check your computer's system information to see whether
  you need a x86 (32-bit) or x64 (64-bit) version of the program, and then contact
  the software publisher.

I tried to find a 32-bit version of winutils.exe but could not locate one. Please help me get the installation working. Thanks in advance.
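
Before hunting for binaries, it is worth confirming whether the machine and the installed JVM are actually 32-bit. A minimal check on Windows might look like the sketch below (standard Windows and Java commands, not taken from the original post):

  :: Report whether Windows itself is 32-bit or 64-bit
  echo %PROCESSOR_ARCHITECTURE%
  :: Prints AMD64 (or similar) on 64-bit Windows, x86 on 32-bit Windows

  :: Alternatively, query the OS architecture directly
  wmic os get osarchitecture

  :: Check the installed JVM; a 64-bit JVM reports "64-Bit Server VM"
  :: in its version banner
  java -version

If the OS is 64-bit, the error above usually means the downloaded winutils.exe was built for the wrong architecture, so a matching 64-bit build would be needed rather than a 32-bit one.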

Answer #1 (qnzebej0)

The following links may help:
https://issues.apache.org/jira/browse/hadoop-9922
https://issues.apache.org/jira/browse/hadoop-11784
Can't find winutils.exe for Hadoop 2.6.0 on 32-bit Windows
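
Once a winutils.exe that matches both the Hadoop version and the OS architecture is in place, the usual Windows setup looks roughly like this sketch (paths are examples, adjust to your machine):

  :: Point HADOOP_HOME at the directory that contains bin\winutils.exe
  set HADOOP_HOME=C:\Hadoop
  set PATH=%PATH%;%HADOOP_HOME%\bin

  :: Spark's Hive support needs a writable \tmp\hive directory
  mkdir C:\tmp\hive
  %HADOOP_HOME%\bin\winutils.exe chmod 777 /tmp/hive

  :: Start the shell from the Spark installation directory
  :: (C:\spark is an example path; use your actual Spark folder)
  cd C:\spark
  bin\spark-shell

With HADOOP_HOME set and \tmp\hive writable, the NullPointerException from SessionState.start during spark-shell startup typically goes away.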
