IntelliJ IDEA: Spark RDD APIs using Scala programming

vdzxcuhz · published 2023-10-15 in Spark

I want to create a Spark API program using Scala, and this exception is thrown when the Spark context is created.
I have:

  • Spark version 3.5.0
  • Scala version 2.12.18
  • JDK 19
Exception in thread "main" java.lang.IllegalAccessError: class org.apache.spark.storage.StorageUtils$ (in unnamed module @0x47542153) cannot access class sun.nio.ch.DirectBuffer (in module java.base) because module java.base does not export sun.nio.ch to unnamed module @0x47542153
    at org.apache.spark.storage.StorageUtils$.<clinit>(StorageUtils.scala:213)
    at org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:121)
    at org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:358)
    at org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:295)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:344)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:196)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:284)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:483)
    at ReadCSVFile$.main(ReadCSVFile.scala:15)
    at ReadCSVFile.main(ReadCSVFile.scala)

cidc1ykv1#

According to https://spark.apache.org/docs/latest/, JDK 19 is not a supported JDK version:
Spark runs on Java 8/11/17, Scala 2.12/2.13, Python 3.8+ and R 3.5+. Java 8 versions prior to 8u371 are deprecated as of Spark 3.5.0.
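One way to avoid the error is to switch the project to a supported JDK (for example 17) and keep the Scala and Spark versions aligned with it. A minimal build.sbt sketch under those assumptions (the project name and settings are illustrative, not from the original post):

```scala
// build.sbt — minimal sketch, assuming sbt and a locally installed JDK 17
ThisBuild / scalaVersion := "2.12.18"

// Spark 3.5.0 is published for Scala 2.12 and 2.13; %% picks the right artifact
libraryDependencies += "org.apache.spark" %% "spark-core" % "3.5.0"

// Target a Java release that Spark supports (8/11/17)
javacOptions ++= Seq("--release", "17")
```

In IntelliJ IDEA, also set the Project SDK to the same JDK (File > Project Structure > Project > SDK). If a similar IllegalAccessError still appears on JDK 17 when launching from the IDE, adding the VM option `--add-exports java.base/sun.nio.ch=ALL-UNNAMED` to the run configuration is a commonly reported workaround.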
