I know there are similar questions about this exception, but none of them solved my problem. I have a Java application. Recently I had to upgrade from Java 17 to 21, which also forced an upgrade of Apache Spark to 3.5.0. Before the upgrades my application ran without any issues. After all the upgrades, I get the following exception when running the application:
Exception in thread "main" java.lang.reflect.InvocationTargetException
at java.base/jdk.internal.reflect.DirectMethodHandleAccessor.invoke(DirectMethodHandleAccessor.java:109)
at java.base/java.lang.reflect.Method.invoke(Method.java:580)
at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:49)
at org.springframework.boot.loader.Launcher.launch(Launcher.java:108)
at org.springframework.boot.loader.Launcher.launch(Launcher.java:58)
at org.springframework.boot.loader.JarLauncher.main(JarLauncher.java:88)
Caused by: java.lang.ClassCastException: class org.apache.logging.slf4j.SLF4JLoggerContext cannot be cast to class org.apache.logging.log4j.core.LoggerContext (org.apache.logging.slf4j.SLF4JLoggerContext and org.apache.logging.log4j.core.LoggerContext are in unnamed module of loader org.springframework.boot.loader.LaunchedURLClassLoader @20fa23c1)
at org.apache.spark.util.Utils$.setLogLevel(Utils.scala:2318)
at org.apache.spark.SparkContext.setLogLevel(SparkContext.scala:399)
at org.apache.spark.api.java.JavaSparkContext.setLogLevel(JavaSparkContext.scala:673)
at cz.cuni.matfyz.mminfer.wrappers.MongoDBSchemaLessWrapper.initiateContext(MongoDBSchemaLessWrapper.java:72)
at cz.cuni.matfyz.mminfer.algorithms.rba.RecordBasedAlgorithm.process(RecordBasedAlgorithm.java:27)
at cz.cuni.matfyz.mminfer.MMInferOneInAll.main(MMInferOneInAll.java:45)
at java.base/jdk.internal.reflect.DirectMethodHandleAccessor.invoke(DirectMethodHandleAccessor.java:103)
... 5 more
This is how I run the application (the command-line options are there to ensure compatibility between Spark 3.5.0 and Java 21):
java --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.lang.invoke=ALL-UNNAMED --add-opens=java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.io=ALL-UNNAMED --add-opens=java.base/java.net=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/java.util=ALL-UNNAMED --add-opens=java.base/java.util.concurrent=ALL-UNNAMED --add-opens=java.base/java.util.concurrent.atomic=ALL-UNNAMED --add-opens=java.base/sun.nio.ch=ALL-UNNAMED --add-opens=java.base/sun.nio.cs=ALL-UNNAMED --add-opens=java.base/sun.security.action=ALL-UNNAMED --add-opens=java.base/sun.util.calendar=ALL-UNNAMED --add-opens=java.security.jgss/sun.security.krb5=ALL-UNNAMED -jar target/MM-Infer-One-In-All-1.0-SNAPSHOT.jar C:\Users\alzbe\Documents\mff_mgr\Diplomka\Apps\temp\checkpoint srutkova yelpbusinesssample
These are the relevant dependencies in my pom file:
<dependencies>
    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-sql -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.12</artifactId>
        <version>3.5.0</version>
        <!--<scope>provided</scope>-->
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-core -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.12</artifactId>
        <version>3.5.0</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.springframework.boot/spring-boot-starter -->
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter</artifactId>
        <version>2.4.13</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.mongodb/mongo-java-driver -->
    <dependency>
        <groupId>org.mongodb</groupId>
        <artifactId>mongo-java-driver</artifactId>
        <version>3.12.10</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.mongodb.spark/mongo-spark-connector -->
    <dependency>
        <groupId>org.mongodb.spark</groupId>
        <artifactId>mongo-spark-connector_2.12</artifactId>
        <version>3.0.1</version>
        <scope>compile</scope>
        <type>jar</type>
    </dependency>
    <dependency>
        <groupId>junit</groupId>
        <artifactId>junit</artifactId>
        <version>3.8</version>
        <scope>compile</scope>
    </dependency>
    <dependency>
        <groupId>org.junit.jupiter</groupId>
        <artifactId>junit-jupiter</artifactId>
        <version>5.4.0</version>
        <scope>compile</scope>
    </dependency>
</dependencies>
I have tried switching to different versions of Spring Boot and excluding slf4j/log4j (in fact, excluding log4j left me unable to connect to my MongoDB). However, all of my attempts failed.
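For reference, the exclusions I tried were roughly of this form (the exact artifacts varied between attempts, so treat this only as a sketch):

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.12</artifactId>
    <version>3.5.0</version>
    <exclusions>
        <!-- drop Spark's Log4j 2 modules so they cannot clash with the
             ones pulled in via Spring Boot (wildcard needs Maven 3.2.1+) -->
        <exclusion>
            <groupId>org.apache.logging.log4j</groupId>
            <artifactId>*</artifactId>
        </exclusion>
    </exclusions>
</dependency>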
Does anyone have any suggestions as to what the problem might be?
1 Answer
org.apache.logging.slf4j.SLF4JLoggerContext comes from log4j-to-slf4j, while org.apache.logging.log4j.core.LoggerContext comes from log4j-core; the ClassCastException means mismatched Log4j 2 modules are being mixed on your classpath.
Your Spring Boot version forces 2.13 (via its dependency management), while Spark seems to need 2.22. So one thing you can try is to force 2.22 (if you don't know how, look up how to do that in Maven, and double-check with the dependency tree that the forced version actually wins). If that doesn't work, you need to upgrade Spring. No current Spring Boot release manages 2.22, but a possible solution is Spring Boot 3 combined with forcing 2.22 (if Spring Boot 2 with a forced 2.22 does not work). Otherwise, unfortunately, you cannot use the latest Spark.
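A minimal sketch of what "forcing" a single Log4j 2 version could look like in the pom, assuming you import the Log4j BOM in dependencyManagement and that 2.22.0 is the version you want to pin (adjust the version as needed):

<dependencyManagement>
    <dependencies>
        <!-- Import the Log4j BOM so log4j-api, log4j-core and log4j-to-slf4j
             all resolve to the same pinned version; 2.22.0 is only an example -->
        <dependency>
            <groupId>org.apache.logging.log4j</groupId>
            <artifactId>log4j-bom</artifactId>
            <version>2.22.0</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>

You can then verify which versions actually end up on the classpath with mvn dependency:tree -Dincludes=org.apache.logging.log4j. If you move to the spring-boot-starter-parent, overriding its log4j2.version property is another way to pin the version.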