java.lang.NoClassDefFoundError: org/apache/spark/Logging

pxy2qtax · posted 2021-05-27 in Spark

I keep getting the error below. Can anyone help me?

  Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/Logging
      at java.lang.ClassLoader.defineClass1(Native Method)
      at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
      at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
      at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
      at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
      at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
      at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
      at java.security.AccessController.doPrivileged(Native Method)
      at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
      at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
      at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
      at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
      at com.datastax.spark.connector.japi.DStreamJavaFunctions.<init>(DStreamJavaFunctions.java:24)
      at com.datastax.spark.connector.japi.CassandraStreamingJavaUtil.javaFunctions(CassandraStreamingJavaUtil.java:55)
      at SparkStream.main(SparkStream.java:51)
      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      at java.lang.reflect.Method.invoke(Method.java:498)
      at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
  Caused by: java.lang.ClassNotFoundException: org.apache.spark.Logging
      at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
      at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
      at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
      at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
      ... 20 more

The error occurs when I compile the following code. I have searched online but found no solution. It happens when I add saveToCassandra.

  import com.datastax.spark.connector.japi.CassandraStreamingJavaUtil;
  import org.apache.spark.SparkConf;
  import org.apache.spark.api.java.JavaSparkContext;
  import org.apache.spark.streaming.Duration;
  import org.apache.spark.streaming.api.java.JavaDStream;
  import org.apache.spark.streaming.api.java.JavaPairInputDStream;
  import org.apache.spark.streaming.api.java.JavaStreamingContext;
  import org.apache.spark.streaming.kafka.KafkaUtils;

  import java.io.Serializable;
  import java.util.Collections;
  import java.util.HashMap;
  import java.util.Map;
  import java.util.Set;

  import static com.datastax.spark.connector.japi.CassandraJavaUtil.mapToRow;

  /**
   * Created by jonas on 10/10/16.
   */
  public class SparkStream implements Serializable {
      public static void main(String[] args) throws Exception {
          SparkConf conf = new SparkConf(true)
                  .setAppName("TwitterToCassandra")
                  .setMaster("local[*]")
                  .set("spark.cassandra.connection.host", "127.0.0.1")
                  .set("spark.cassandra.connection.port", "9042");

          JavaSparkContext sc = new JavaSparkContext(conf);
          JavaStreamingContext ssc = new JavaStreamingContext(sc, new Duration(5000));

          // Direct Kafka stream reading the "Test" topic from a local broker.
          Map<String, String> kafkaParams = new HashMap<>();
          kafkaParams.put("bootstrap.servers", "localhost:9092");
          Set<String> topics = Collections.singleton("Test");
          JavaPairInputDStream<String, String> directKafkaStream = KafkaUtils.createDirectStream(
                  ssc,
                  String.class,
                  String.class,
                  kafka.serializer.StringDecoder.class,
                  kafka.serializer.StringDecoder.class,
                  kafkaParams,
                  topics
          );

          // Parse each Kafka message value into a Tweet and write it to Cassandra.
          JavaDStream<Tweet> createTweet = directKafkaStream.map(s -> createTweet(s._2()));
          CassandraStreamingJavaUtil.javaFunctions(createTweet)
                  .writerBuilder("mykeyspace", "rawtweet", mapToRow(Tweet.class))
                  .saveToCassandra();

          ssc.start();
          ssc.awaitTermination();
      }

      public static Tweet createTweet(String rawKafka) {
          String[] splitted = rawKafka.split("\\|");
          return new Tweet(splitted[0], splitted[1], splitted[2], splitted[3]);
      }
  }

My pom is as follows.

  <?xml version="1.0" encoding="UTF-8"?>
  <project xmlns="http://maven.apache.org/POM/4.0.0"
           xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
           xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
      <modelVersion>4.0.0</modelVersion>
      <groupId>com.company</groupId>
      <artifactId>Sentiment</artifactId>
      <version>1.0-SNAPSHOT</version>
      <build>
          <plugins>
              <plugin>
                  <groupId>org.apache.maven.plugins</groupId>
                  <artifactId>maven-compiler-plugin</artifactId>
                  <configuration>
                      <source>1.8</source>
                      <target>1.8</target>
                  </configuration>
              </plugin>
          </plugins>
      </build>
      <repositories>
          <repository>
              <id>twitter4j.org</id>
              <name>twitter4j.org Repository</name>
              <url>http://twitter4j.org/maven2</url>
              <releases>
                  <enabled>true</enabled>
              </releases>
              <snapshots>
                  <enabled>true</enabled>
              </snapshots>
          </repository>
      </repositories>
      <dependencies>
          <dependency>
              <groupId>org.apache.spark</groupId>
              <artifactId>spark-streaming_2.11</artifactId>
              <version>2.0.1</version>
          </dependency>
          <dependency>
              <groupId>org.apache.spark</groupId>
              <artifactId>spark-core_2.10</artifactId>
              <version>2.0.0</version>
          </dependency>
          <dependency>
              <groupId>org.apache.spark</groupId>
              <artifactId>spark-sql_2.10</artifactId>
              <version>2.0.0</version>
          </dependency>
          <dependency>
              <groupId>org.apache.spark</groupId>
              <artifactId>spark-streaming-kafka-0-8_2.11</artifactId>
              <version>2.0.1</version>
          </dependency>
          <!-- https://mvnrepository.com/artifact/org.scala-lang/scala-library -->
          <dependency>
              <groupId>org.scala-lang</groupId>
              <artifactId>scala-library</artifactId>
              <version>2.11.8</version>
          </dependency>
          <!-- https://mvnrepository.com/artifact/com.datastax.spark/spark-cassandra-connector_2.10 -->
          <dependency>
              <groupId>com.datastax.spark</groupId>
              <artifactId>spark-cassandra-connector_2.10</artifactId>
              <version>1.6.2</version>
          </dependency>
          <dependency>
              <groupId>org.apache.kafka</groupId>
              <artifactId>kafka_2.10</artifactId>
              <version>0.9.0.0</version>
          </dependency>
          <dependency>
              <groupId>org.twitter4j</groupId>
              <artifactId>twitter4j-core</artifactId>
              <version>[4.0,)</version>
          </dependency>
          <dependency>
              <groupId>org.twitter4j</groupId>
              <artifactId>twitter4j-stream</artifactId>
              <version>4.0.4</version>
          </dependency>
          <dependency>
              <groupId>org.twitter4j</groupId>
              <artifactId>twitter4j-async</artifactId>
              <version>4.0.4</version>
          </dependency>
      </dependencies>
  </project>

f87krz0w1#

The logging jar is missing from your list of dependency jars. Try downloading the "spark-core_2.11-1.5.2.logging" jar from the mvn repository and adding it to your Spark project as an external jar; then you will no longer get the "java.lang.NoClassDefFoundError: org/apache/spark/Logging" error. Pick the jar matching your Scala version (2.10, 2.11, etc.).
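If you would rather let Maven manage the downloaded jar than attach it by hand in the IDE, a system-scoped dependency is one option. This is a minimal sketch, assuming the jar was saved to a lib/ folder under the project root; the groupId/artifactId coordinates below are made up for illustration, since this jar is not published under real Maven coordinates:

  <!-- Hypothetical coordinates: the jar is not in any public repository,
       so it is referenced straight from the local filesystem. -->
  <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core-logging</artifactId>
      <version>1.5.2</version>
      <scope>system</scope>
      <systemPath>${project.basedir}/lib/spark-core_2.11-1.5.2.logging.jar</systemPath>
  </dependency>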


qmb5sa222#

org.apache.spark.Logging is available in Spark version 1.5.2 or lower. It is not in 2.0.0. Please change your versions as follows:

  <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-streaming_2.11</artifactId>
      <version>1.5.2</version>
  </dependency>
  <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.10</artifactId>
      <version>1.5.2</version>
  </dependency>
  <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-sql_2.10</artifactId>
      <version>1.5.2</version>
  </dependency>
  <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-streaming-kafka-0-8_2.11</artifactId>
      <version>1.6.2</version>
  </dependency>

7ivaypg93#

This pom.xml resolved the problem for me:

  <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.10</artifactId>
      <version>1.6.1</version>
  </dependency>
  <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-streaming_2.10</artifactId>
      <version>1.6.1</version>
  </dependency>

q9rjltbz4#

This error occurs because you are using the Spark 2.0 libraries with a connector from Spark 1.6 (which looks for the Spark 1.6 Logging class). Use version 2.0.5 of the connector.
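For reference, a minimal sketch of what the aligned dependency could look like, assuming Spark 2.0.x with Scala 2.11 (it replaces the spark-cassandra-connector_2.10 1.6.2 entry from the question's pom):

  <!-- Connector from the 2.0.x line, built for Scala 2.11 to match the spark-*_2.11 artifacts -->
  <dependency>
      <groupId>com.datastax.spark</groupId>
      <artifactId>spark-cassandra-connector_2.11</artifactId>
      <version>2.0.5</version>
  </dependency>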


ego6inou5#

This is because the org.apache.spark.Logging class went missing after 1.5.2, as everyone has said (later versions only have org.apache.spark.internal.Logging ...).
No Maven-based solution seemed to resolve this dependency, so I simply tried adding the class to the lib manually. Here is how I solved the problem:

1. Package the Scala class org.apache.spark.internal.Logging into a public jar, or download it from https://raw.githubusercontent.com/swordsmanliu/sparkstreaminghbase/master/lib/spark-core_2.11-1.5.2.logging.jar (thanks to that host).
2. Move the jar into your Spark cluster's jars directory.
3. Submit your project again; hope it helps.


wz3gfoph6#

I got the fix by changing the jars mentioned above.
Initially, I was using a downgraded jar for Spark Kafka streaming:

  <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-streaming-kafka-0-8_2.11</artifactId>
      <version>2.1.1</version>
  </dependency>

I also removed the multiple slf4j-log4j.jar and log4j.jar files that I had added externally, alongside the Spark and Kafka jar libraries.
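If the duplicates arrive transitively through Maven rather than as hand-copied jars, the same cleanup can be expressed with exclusions. A sketch against the kafka_2.10 dependency from the question's pom; that this is where the duplicates come from is an assumption, so verify with mvn dependency:tree first:

  <dependency>
      <groupId>org.apache.kafka</groupId>
      <artifactId>kafka_2.10</artifactId>
      <version>0.9.0.0</version>
      <exclusions>
          <!-- Assumed source of the duplicate logging jars -->
          <exclusion>
              <groupId>org.slf4j</groupId>
              <artifactId>slf4j-log4j12</artifactId>
          </exclusion>
          <exclusion>
              <groupId>log4j</groupId>
              <artifactId>log4j</artifactId>
          </exclusion>
      </exclusions>
  </dependency>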


e0bqpujr7#

This is a version problem; try the latest versions:

  <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-sql_2.11</artifactId>
      <version>2.1.0</version>
      <scope>provided</scope>
  </dependency>
  <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-streaming-kafka-0-8_2.11</artifactId>
      <version>2.1.1</version>
  </dependency>
  <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-streaming_2.11</artifactId>
      <version>2.1.0</version>
  </dependency>
  <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.11</artifactId>
      <version>2.1.0</version>
  </dependency>

8aqjt8rx8#

Downloaded the jar and used it with --jars in spark-submit; this worked for me:

  spark-submit --class com.SentimentTwiteer --packages "org.apache.spark:spark-streaming-twitter_2.11:1.6.3" --jars /root/Desktop/spark-core_2.11-1.5.2.logging.jar /root/Desktop/SentimentTwiteer.jar xx


wgeznvg79#

If you are using IntelliJ, simply checking the "Include dependencies with 'Provided' scope" checkbox solves the problem, without touching the pom or downloading any files manually.


7xzttuei10#

One possible cause of this problem is a lib and class conflict. I faced it and solved it with some Maven exclusions:

  <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.11</artifactId>
      <version>2.0.0</version>
      <scope>provided</scope>
      <exclusions>
          <exclusion>
              <groupId>log4j</groupId>
              <artifactId>log4j</artifactId>
          </exclusion>
      </exclusions>
  </dependency>
  <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-streaming_2.11</artifactId>
      <version>2.0.0</version>
      <scope>provided</scope>
  </dependency>
  <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-streaming-kafka-0-10_2.11</artifactId>
      <version>2.0.0</version>
      <exclusions>
          <exclusion>
              <groupId>org.slf4j</groupId>
              <artifactId>slf4j-log4j12</artifactId>
          </exclusion>
          <exclusion>
              <groupId>log4j</groupId>
              <artifactId>log4j</artifactId>
          </exclusion>
      </exclusions>
  </dependency>

6fe3ivhb11#

Download the jar below and put it in your library; it will work as expected.
https://raw.githubusercontent.com/swordsmanliu/sparkstreaminghbase/master/lib/spark-core_2.11-1.5.2.logging.jar


g0czyy6m12#

Download spark-core_2.11-1.5.2.logging.jar and pass it with the --jars option:

  spark-submit --class com.SentimentTwiteer --packages "org.apache.spark:spark-streaming-twitter_2.11:1.6.3" --jars /root/Desktop/spark-core_2.11-1.5.2.logging.jar /root/Desktop/SentimentTwiteer.jar consumerKey consumerSecret accessToken accessTokenSecret yourSearchTag

https://github.com/sinhavicky4/sentimenttwiteer
