spark-shell: The system cannot find the path specified

fzsnzjdm · posted 2021-07-14 in Spark
Follow (0) | Answers (1) | Views (509)

After installing the Anaconda package I cannot start spark-shell under Windows 7 anymore. When I type spark-shell, the console answers with The system cannot find the path specified. and of course the Spark shell does not start.
This is my echo %PATH%:
c:\program files\microsoft mpi\bin\;c:\program files (x86)\common files\intel\shared files\cpp\bin\intel64;c:\program files (x86)\intel\icls client\;c:\program files\intel\icls client\;c:\windows\system32;c:\windows;c:\windows\system32\wbem;c:\windows\system32\windowspowershell\v1.0\;c:\program files\intel\intel(r) management engine components\dal;c:\program files (x86)\intel\intel(r) management engine components\dal;c:\program files\intel\intel(r) management engine components\ipt;c:\program files (x86)\intel\intel(r) management engine components\ipt;c:\program files\lenovo\fingerprint manager pro\;c:\program files (x86)\winscp\;c:\program files (x86)\lenovo\access connections\;c:\program files\miktex 2.9\miktex\bin\x64\;c:\program files\putty\;c:\program files (x86)\intel\ucrt\;c:\program files\intel\ucrt\;c:\program files\intel\wifi\bin\;c:\program files\common files\intel\wirelesscommon\;c:\program files\microsoft sql server\130\tools\binn\;c:\program files\dotnet\;c:\program files\anaconda3;c:\program files\anaconda3\scripts;c:\program files\anaconda3\library\bin;c:\program files (x86)\gtksharp\2.12\bin;c:\program files\git\cmd;c:\program files\tortoisegit\bin;c:\program files\tortoisesvn\bin;c:\program files (x86)\sbt\bin;c:\program files (x86)\scala\bin;c:\program files (x86)\java\jre1.8.0_144\bin;c:\program files\intel\wifi\bin\;c:\program files\common files\intel\wirelesscommon\;c:\program files (x86)\graphviz2.38\bin\;c:\program files (x86)\sbt\bin;c:\program files (x86)\scala\bin;d:\spark\bin;d:\hadoop\bin
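A quick sanity check for a PATH like this, assuming a plain cmd session, is to ask Windows where each command actually resolves from:

    REM Lists every match on PATH (where honors PATHEXT, so it finds spark-shell.cmd)
    where spark-shell
    where java
    REM The variables Spark's launch scripts build the java invocation from
    echo %SPARK_HOME%
    echo %JAVA_HOME%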
This is my echo %SPARK_HOME%:
d:\Spark
This is my echo %JAVA_HOME%:
c:\program files (x86)\java\jre1.8.0_144
This is my java -version:
java version "1.8.0_144"
Java(TM) SE Runtime Environment (build 1.8.0_144-b01)
Java HotSpot(TM) Client VM (build 25.144-b01, mixed mode, sharing)
I already tried reinstalling Java, but that did not help. There is a similar question here, but I don't see any wrongly set environment variables in my setup, so I really don't know how to fix this... Any ideas?
After some testing I found that after cd-ing into %SPARK_HOME%\bin I can actually execute spark-shell. It quits with an error message:
\java\jre1.8.0_144\bin\java was unexpected at this time.
The error occurs while executing the last line of Spark\bin\spark-submit2.cmd, which is "%~dp0spark-class2.cmd" %CLASS% %*.
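That wording is a known cmd.exe quirk rather than a Spark bug: when %JAVA_HOME% contains parentheses, as "c:\program files (x86)\..." does, expanding it inside a parenthesized if/else block lets the ")" of "(x86)" close the block early, and cmd trips over the leftover text. A minimal sketch of the mechanism (illustrative only, not the actual Spark launch script):

    REM repro.cmd -- demonstrates the "(x86)" parenthesis problem
    set "JAVA_HOME=c:\program files (x86)\java\jre1.8.0_144"
    if not "x%JAVA_HOME%"=="x" (
        REM %JAVA_HOME% expands before the block is parsed; the ")" in "(x86)"
        REM terminates the if-block early, so cmd reports:
        REM   \java\jre1.8.0_144\bin\java was unexpected at this time.
        set RUNNER=%JAVA_HOME%\bin\java
    )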
Update 1:
Changing %JAVA_HOME% from "c:\program files…" to "c:\progra~1…" did fix part of the problem: spark-shell now appears to start (a note on looking up these 8.3 short names follows after the stack trace). However, there are lots of Access denied errors:

  java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionStateBuilder':
    at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$instantiateSessionState(SparkSession.scala:1053)
    at org.apache.spark.sql.SparkSession$$anonfun$sessionState$2.apply(SparkSession.scala:130)
    at org.apache.spark.sql.SparkSession$$anonfun$sessionState$2.apply(SparkSession.scala:130)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:129)
    at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:126)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:938)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:938)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
    at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
    at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
    at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:938)
    at org.apache.spark.repl.Main$.createSparkSession(Main.scala:97)
    ... 47 elided
  Caused by: org.apache.spark.sql.AnalysisException: java.lang.RuntimeException: java.lang.RuntimeException: java.io.IOException: Access is denied;
    at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:106)
    at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:193)
    at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:105)
    at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:93)
    at org.apache.spark.sql.hive.HiveSessionStateBuilder.externalCatalog(HiveSessionStateBuilder.scala:39)
    at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog$lzycompute(HiveSessionStateBuilder.scala:54)
    at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:52)
    at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:35)
    at org.apache.spark.sql.internal.BaseSessionStateBuilder.build(BaseSessionStateBuilder.scala:289)
    at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$instantiateSessionState(SparkSession.scala:1050)
    ... 61 more
  Caused by: java.lang.RuntimeException: java.lang.RuntimeException: java.io.IOException: Access is denied
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
    at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:191)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
    at java.lang.reflect.Constructor.newInstance(Unknown Source)
    at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
    at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:362)
    at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:266)
    at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
    at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
    at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:194)
    at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:194)
    at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:194)
    at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
    ... 70 more
  Caused by: java.lang.RuntimeException: java.io.IOException: Access is denied
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:515)
    ... 84 more
  Caused by: java.io.IOException: Access is denied
    at java.io.WinNTFileSystem.createFileExclusively(Native Method)
    at java.io.File.createTempFile(Unknown Source)
    at org.apache.hadoop.hive.ql.session.SessionState.createTempFile(SessionState.java:818)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:513)
    ... 84 more
  <console>:14: error: not found: value spark
         import spark.implicits._
                ^
  <console>:14: error: not found: value spark
         import spark.sql
         ^
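On the progra~1 trick from update 1: the 8.3 short names don't have to be guessed ("program files" typically maps to progra~1 and "program files (x86)" to progra~2, but the mapping can differ per machine). They can be read off at a cmd prompt:

    REM Show 8.3 short names of the top-level folders, e.g. PROGRA~1 and PROGRA~2
    dir /x c:\
    REM Expand the current JAVA_HOME to its full short-name form
    REM (inside a .cmd file, double the percent signs: %%I and %%~sI)
    for %I in ("%JAVA_HOME%") do @echo %~sI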

Update 2:
Running spark-shell as an administrator works! However, that is probably quite unsafe and I don't consider it a real solution.
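The innermost frames of the trace (java.io.File.createTempFile called from org.apache.hadoop.hive.ql.session.SessionState) suggest Hive cannot create its scratch files. A commonly suggested alternative to running elevated is to create the \tmp\hive directory and open up its permissions with the winutils.exe from a Windows Hadoop build; assuming it lives in d:\hadoop\bin, as the PATH above suggests:

    REM Create Hive's scratch directory on the drive spark-shell runs from
    mkdir d:\tmp\hive
    REM Grant full Hadoop-style permissions so SessionState can create temp files
    d:\hadoop\bin\winutils.exe chmod -R 777 d:\tmp\hive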


w8ntj3qf1#

Make sure you have set JAVA_HOME and SBT_HOME correctly; to be safe, I also added both of them to the PATH variable. For doing this easily I can recommend "Rapid Environment Editor", a simple and decent tool for editing system variables. This approach worked for me, since I got the same problem you have. For example:
JAVA_HOME set to c:\program files\java\jdk1.8.0_151
SBT_HOME set to c:\program files (x86)\sbt\
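If you prefer the command line over a GUI, the same settings can be made with the built-in setx; the paths below are just the examples from above, so adjust them to your installation. Note that setx truncates values longer than 1024 characters, which is one reason a dedicated editor is safer for a PATH as long as the one in the question:

    REM Run from an elevated prompt; /M writes the machine-level variables.
    REM 8.3 short names sidestep the "(x86)" parenthesis problem described above.
    setx /M JAVA_HOME "C:\Progra~1\Java\jdk1.8.0_151"
    setx /M SBT_HOME "C:\Progra~2\sbt"
    REM setx does not affect the current session, so spell the paths out here:
    setx /M PATH "%PATH%;C:\Progra~1\Java\jdk1.8.0_151\bin;C:\Progra~2\sbt\bin"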
