Error when launching the Spark REPL

wh6knrhe · asked 2021-05-29 · in Hadoop

I have a prebuilt Spark 1.4.1 and I'm running HDP 2.6. When I try to run spark-shell, it gives me the following error message.

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FSDataInputStream
    at org.apache.spark.deploy.SparkSubmitArguments$$anonfun$mergeDefaultSparkProperties$1.apply(SparkSubmitArguments.scala:111)
    at org.apache.spark.deploy.SparkSubmitArguments$$anonfun$mergeDefaultSparkProperties$1.apply(SparkSubmitArguments.scala:111)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.deploy.SparkSubmitArguments.mergeDefaultSparkProperties(SparkSubmitArguments.scala:111)
    at org.apache.spark.deploy.SparkSubmitArguments.<init>(SparkSubmitArguments.scala:97)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:107)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.fs.FSDataInputStream
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)

What is the problem?

1sbrub3j 1#

A ClassNotFoundException occurs when the class loader cannot find a required class on the classpath. So, basically, you should check your classpath and add the missing class to it.
Check whether hadoop-common-0.21.0.jar has been added to the classpath.
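For example, here is a minimal sketch of putting the Hadoop common jar on the driver classpath when launching the shell. The jar path below is an assumption; adjust it to wherever the jar actually lives on your system.

    # Hypothetical location of the Hadoop common jar (adjust to your installation).
    HADOOP_COMMON_JAR=/usr/hdp/current/hadoop-client/hadoop-common.jar

    # Add the jar to the driver classpath when starting the Spark REPL.
    spark-shell --driver-class-path "$HADOOP_COMMON_JAR"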

qeeaahzv 2#

Is it possible that your HADOOP_HOME is not set, as in this question?
Hadoop installation not found: $HADOOP_HOME must be set or hadoop must be in the path
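If that is the case, a minimal sketch of setting HADOOP_HOME before launching spark-shell, assuming an HDP-style install path (the directory below is an assumption, not taken from the question), and pointing a "Hadoop free" Spark build at the Hadoop jars:

    # Assumed HDP install location; adjust to your cluster's layout.
    export HADOOP_HOME=/usr/hdp/current/hadoop-client
    export PATH="$HADOOP_HOME/bin:$PATH"

    # For a prebuilt Spark that ships without Hadoop, point it at the Hadoop jars
    # (typically placed in conf/spark-env.sh).
    export SPARK_DIST_CLASSPATH=$(hadoop classpath)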
