java.lang.NoClassDefFoundError: org/apache/kafka/common/message/KafkaLZ4BlockOutputStream

p3rjfoxz · posted 2021-06-07 in Kafka

I am getting a NoClassDefFoundError when using the Spark Streaming API. The streaming code is below.
I know this is caused by a wrong jar or dependency, but I cannot tell which one.
I am using Kafka 0.9.0 and Spark 1.6.1 — are these dependencies fine, or do I need to change them? My pom.xml is attached below.
This is the streaming API call I am using:

JavaPairInputDStream<String, byte[]> directKafkaStream = KafkaUtils.createDirectStream(jsc, String.class, byte[].class, StringDecoder.class, DefaultDecoder.class, kafkaParams, topicSet);

And here is my code; the error is thrown at itr.next():

directKafkaStream.foreachRDD(rdd -> {

    rdd.foreachPartition(itr -> {

        try {

            while (itr.hasNext()) {
                // NoClassDefFoundError is thrown here, at itr.next();
                // the rest of the loop body was cut off in the original post
            }

        } catch (Exception e) {
            // ...
        }
    });
});

java.lang.NoClassDefFoundError: org/apache/kafka/common/message/KafkaLZ4BlockOutputStream
Here is my pom.xml:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <parent>
        <groupId>com.abcd.rep.xyz</groupId>
        <artifactId>xyz</artifactId>
        <version>1.0</version>
        <relativePath>../pom.xml</relativePath>
    </parent>

    <artifactId>SparkPOC</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <packaging>jar</packaging>
    <name>SparkPOCde</name>
    <url>http://maven.apache.org</url>

    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <spark-version>1.6.1</spark-version>
        <kafka-version>0.9.0.0</kafka-version>
    </properties>

    <dependencies>
        <!-- http://mvnrepository.com/artifact/org.springframework/spring-core -->
        <!-- http://mvnrepository.com/artifact/org.springframework/spring-jdbc -->
        <dependency>
            <groupId>log4j</groupId>
            <artifactId>log4j</artifactId>
            <version>1.2.17</version>
        </dependency>

        <!-- http://mvnrepository.com/artifact/org.apache.spark/spark-streaming_2.10 -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming_2.10</artifactId>
            <version>${spark-version}</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-streaming-kafka_2.10 -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming-kafka_2.10</artifactId>
            <version>1.6.2</version>
            <exclusions>
                <exclusion>
                    <groupId>io.netty</groupId>
                    <artifactId>netty</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>io.jboss.netty</groupId>
                    <artifactId>netty</artifactId>
                </exclusion>
            </exclusions>
        </dependency>

        <dependency>
            <groupId>com.abcd.rep.xyz</groupId>
            <artifactId>xyzCommon</artifactId>
            <version>1.0</version>
            <type>jar</type>
        </dependency>

        <!-- http://mvnrepository.com/artifact/ojdbc/ojdbc -->
        <!-- <dependency> <groupId>ojdbc</groupId> <artifactId>ojdbc</artifactId> <version>14</version> </dependency> -->
        <!-- https://mvnrepository.com/artifact/org.mongodb/mongo-java-driver -->
        <!-- http://mvnrepository.com/artifact/org.springframework.data/spring-data-mongodb -->
        <!-- https://mvnrepository.com/artifact/com.googlecode.json-simple/json-simple -->
    </dependencies>

    <build>
        <finalName>appname</finalName>

        <resources>
            <resource>
                <directory>src/main/resources</directory>
                <excludes>
                    <exclude>eventRules.json</exclude>
                    <exclude>log4j.xml</exclude>
                    <exclude>resources.properties</exclude>
                </excludes>
            </resource>
        </resources>

        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.5.1</version>
                <configuration>
                    <source>1.8</source>
                    <target>1.8</target>
                </configuration>
            </plugin>

            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-shade-plugin</artifactId>
                <version>2.4.3</version>
                <executions>
                    <execution>
                        <phase>package</phase>
                        <goals>
                            <goal>shade</goal>
                        </goals>
                        <configuration>
                            <filters>
                                <filter>
                                    <artifact>*:*</artifact>
                                    <excludes>
                                        <exclude>META-INF/*.SF</exclude>
                                        <exclude>META-INF/*.DSA</exclude>
                                        <exclude>META-INF/*.RSA</exclude>
                                    </excludes>
                                </filter>
                            </filters>
                            <transformers>
                                <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
                                    <resource>reference.conf</resource>
                                </transformer>
                                <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                                    <mainClass>com.abcd.rep.xyz.SparkPOCde.EventConsumerServiceImpl</mainClass>
                                </transformer>
                            </transformers>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</project>
bgtovc5b1#

I resolved this by using the 0.8.2.2 version of the Kafka jar.
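A minimal sketch of that change in the pom — the version is the one this answer names; the groupId/artifactId are the standard Kafka client coordinates and are my assumption, since the answer does not spell them out:

```xml
<!-- Assumed coordinates for the 0.8.2.2 Kafka client jar this answer refers to -->
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>0.8.2.2</version>
</dependency>
```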

eh57zj3b2#

Even though my Kafka cluster version is 0.9.0.0, I was handling Kafka and Spark Streaming through my Maven pom with:

<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming_2.10</artifactId>
<version>1.6.0</version>
</dependency>

but I got the error described above. Then I tried adding the dependencies below, and it worked:

<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka_2.10</artifactId>
    <version>0.8.2.1</version>
</dependency>
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>0.8.2.1</version>
</dependency>
0lvr5msh3#

KafkaLZ4BlockOutputStream lives in the kafka-clients jar.
Up to kafka-clients version 0.8.2.2 it is at org/apache/kafka/common/message/KafkaLZ4BlockOutputStream.
From 0.9.0.0 onward it is under org/apache/kafka/common/record/.
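Because the class moved packages between releases, the practical check is which kafka-clients jar actually wins on your classpath. A small diagnostic sketch (not from the original answer) that prints where a class is loaded from, so a version mismatch like this one shows up immediately:

```java
import java.security.CodeSource;

// Sketch: report which jar a class is loaded from, or that it is
// missing. The two Kafka class names in main() are the old (<= 0.8.2.2)
// and new (>= 0.9.0.0) locations discussed above.
public class WhichJar {
    static String locate(String className) {
        try {
            Class<?> c = Class.forName(className);
            CodeSource src = c.getProtectionDomain().getCodeSource();
            // Bootstrap-loaded classes (e.g. java.util.*) have no code source
            return src != null ? src.getLocation().toString() : "(bootstrap classpath)";
        } catch (ClassNotFoundException e) {
            return "NOT FOUND: " + className;
        }
    }

    public static void main(String[] args) {
        // Prints the jar path if the class is present, or "NOT FOUND: ..."
        System.out.println(locate("org.apache.kafka.common.message.KafkaLZ4BlockOutputStream"));
        System.out.println(locate("org.apache.kafka.common.record.KafkaLZ4BlockOutputStream"));
    }
}
```

Run this inside the same JVM/classpath as the Spark job (e.g. from the driver): exactly one of the two lines should report a jar, and that jar's version tells you whether the shaded 0.9.x client is colliding with the 0.8.x client that spark-streaming-kafka_2.10 expects.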
