I am migrating an existing Spark Streaming application from Spark 2.3 to Spark 3.1.1. I have updated the Spark dependencies as shown below:
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.12</artifactId>
    <version>3.1.1</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.12</artifactId>
    <version>3.1.1</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.12</artifactId>
    <version>3.1.1</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql-kafka-0-10_2.12</artifactId>
    <version>3.1.1</version>
</dependency>
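For context, the spark-sql-kafka-0-10 artifact covers a Kafka read roughly like the minimal sketch below; the broker address and topic name are placeholders, not my actual configuration:

import org.apache.spark.sql.SparkSession

object KafkaReadSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("streaming-migration-sketch")
      .getOrCreate()

    // spark-sql-kafka-0-10 provides the "kafka" source used here.
    val stream = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker-host:9092") // placeholder
      .option("subscribe", "input-topic")                    // placeholder
      .load()

    stream.selectExpr("CAST(value AS STRING) AS value")
      .writeStream
      .format("console")
      .start()
      .awaitTermination()
  }
}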
I also added the Phoenix dependencies:
<dependency>
    <groupId>org.apache.phoenix</groupId>
    <artifactId>phoenix-core</artifactId>
    <version>4.13.2-cdh5.11.2</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.phoenix</groupId>
    <artifactId>phoenix-spark</artifactId>
    <version>4.13.2-cdh5.11.2</version>
    <scope>provided</scope>
</dependency>
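These artifacts are used for the kind of Phoenix write sketched below (phoenix-spark 4.x DataFrame API); the table name and ZooKeeper quorum are placeholders:

import org.apache.spark.sql.{SaveMode, SparkSession}

object PhoenixWriteSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("phoenix-write-sketch")
      .getOrCreate()
    import spark.implicits._

    val df = Seq((1L, "a"), (2L, "b")).toDF("ID", "NAME")

    // phoenix-spark 4.x exposes the "org.apache.phoenix.spark" data source;
    // the table name and zkUrl below are placeholders.
    df.write
      .format("org.apache.phoenix.spark")
      .mode(SaveMode.Overwrite)
      .option("table", "MY_TABLE")
      .option("zkUrl", "zk-host:2181")
      .save()
  }
}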
When I run the Spark job, it fails with java.lang.ClassNotFoundException: org.apache.phoenix.jdbc.PhoenixDriver. The same code ran fine on Spark 2.3, but it fails on Spark 3.1.1.
java.lang.ClassNotFoundException: org.apache.phoenix.jdbc.PhoenixDriver
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:264)
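For reference, the trace suggests the Phoenix JDBC driver is being loaded reflectively; a minimal sketch of the plain JDBC path that needs org.apache.phoenix.jdbc.PhoenixDriver on the runtime classpath is below (the ZooKeeper quorum in the URL is a placeholder):

import java.sql.DriverManager

object PhoenixJdbcSketch {
  def main(args: Array[String]): Unit = {
    // Loading the driver reflectively is what fails with the
    // ClassNotFoundException above when the Phoenix client classes
    // are missing from the runtime classpath.
    Class.forName("org.apache.phoenix.jdbc.PhoenixDriver")

    // The ZooKeeper quorum in the JDBC URL is a placeholder.
    val conn = DriverManager.getConnection("jdbc:phoenix:zk-host:2181")
    try {
      val rs = conn.createStatement().executeQuery(
        "SELECT TABLE_NAME FROM SYSTEM.CATALOG LIMIT 1")
      while (rs.next()) {
        println(rs.getString(1))
      }
    } finally {
      conn.close()
    }
  }
}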
Can anyone help me figure out what I am missing here?