Apache Spark: running a Scala code jar throws NoSuchMethodError: scala.Predef$.refArrayOps

huwehgph · posted 2023-04-07 in Apache · 2 answers

My code runs fine in IDEA in local mode, but when I package it into a jar and run it on my deployed Spark server, NoSuchMethodError: scala.Predef$.refArrayOps appears. The failing line is val expectArray = expectVertex.take(2).toArray.sortBy(it => it._1), where expectVertex is a Scala Map whose key type is graphx.VertexId and whose value type is Int.
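For reference, a minimal self-contained sketch of the failing expression (using a plain Map[Long, Int] in place of the real expectVertex, which is not shown in full); note the tuple accessor is ._1 with a dot:

```scala
// Sketch only: reproduces the shape of the failing line without Spark.
object SortByKeyExample {
  def main(args: Array[String]): Unit = {
    // Stand-in for expectVertex: Map[graphx.VertexId, Int] (VertexId is a Long)
    val expectVertex: Map[Long, Int] = Map(3L -> 30, 1L -> 10, 2L -> 20)
    // take(2) picks two entries (iteration order unspecified for a Map),
    // toArray yields Array[(Long, Int)], sortBy(it => it._1) sorts by vertex id.
    // sortBy on an Array goes through the implicit Predef.refArrayOps conversion,
    // which is where the NoSuchMethodError is raised at runtime.
    val expectArray = expectVertex.take(2).toArray.sortBy(it => it._1)
    expectArray.foreach(println)
  }
}
```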
I also hit this problem with a simple Spark test program; there the error occurred on a line using an Array method. The code (package org.example) is as follows:

import org.apache.spark.graphx.{Edge, Graph}
import org.apache.spark.{SparkConf, SparkContext}

import java.util.logging.{Level, Logger}

/**
 * Hello world!
 *
 */
class App{
  def run(): Unit ={
    Logger.getLogger("org.apache.spark").setLevel(Level.WARNING)
    Logger.getLogger("org.eclipse.jetty.server").setLevel(Level.OFF)
    val conf = new SparkConf().setAppName("AXU test")
      .setMaster("local")
    val sc = new SparkContext(conf)
    val vertices = sc.parallelize(Array((1L, "A"), (2L, "B"), (3L, "C"), (4L, "D")))
    val edges = sc.parallelize(Array(Edge(1L, 2L, "friend"), Edge(2L, 3L, "follow"), Edge(3L, 4L, "friend")))
    val graph = Graph(vertices, edges)
    val inDegrees = graph.inDegrees
    inDegrees.collect().foreach(println)
    val deg = inDegrees.collect()
    for( i <- 0 to deg.length-1){
      print("this is no." + (i+1) + " point indegree:")
      println("id: " + deg(i)._1 + " value: " + deg(i)._2)
    }
    sc.stop()
  }
}

The log is:

Exception in thread "main" java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.worker.DriverWrapper$.main(DriverWrapper.scala:65)
    at org.apache.spark.deploy.worker.DriverWrapper.main(DriverWrapper.scala)
Caused by: java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)[Ljava/lang/Object;
    at org.example.App.run(App.scala:23)
    at org.example.Main$.main(Main.scala:6)
    at org.example.Main.main(Main.scala)

If I delete line 23 (the line inDegrees.collect().foreach(println)), the code works fine. My Scala compile and runtime versions are both 2.12.7. It looks like I cannot use methods such as Array[T].foreach or Array[T].sortBy(it => it._1) inside the jar (I package the jar with Maven). The pom.xml content is as follows.

<properties>
        <scala.version>2.12.7</scala.version>
        <spark.version>2.4.4</spark.version>
    </properties>

    <build>
        <plugins>
            <plugin>
                <groupId>net.alchim31.maven</groupId>
                <artifactId>scala-maven-plugin</artifactId>
                <version>3.2.2</version>
                <executions>
                    <execution>
                        <id>compile-scala</id>
                        <phase>compile</phase>
                        <goals>
                            <goal>add-source</goal>
                            <goal>compile</goal>
                        </goals>
                    </execution>
                    <execution>
                        <id>test-compile-scala</id>
                        <phase>test-compile</phase>
                        <goals>
                            <goal>add-source</goal>
                            <goal>testCompile</goal>
                        </goals>
                    </execution>
                </executions>
                <configuration>
                    <scalaVersion>${scala.version}</scalaVersion>
                </configuration>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.8.0</version>
                <configuration>
                    <source>1.8</source>
                    <target>1.8</target>
                </configuration>
            </plugin>
            <plugin>
                <artifactId>maven-assembly-plugin</artifactId>
                <configuration>
                    <descriptorRefs>
                        <descriptorRef>jar-with-dependencies</descriptorRef>
                    </descriptorRefs>
                    <archive>
                        <manifest>
                            <mainClass>org.example.Main</mainClass>
                        </manifest>
                    </archive>
                </configuration>
                <executions>
                    <execution>
                        <id>make-assembly</id>
                        <phase>package</phase>
                        <goals>
                            <goal>assembly</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>

            <plugin>
                <groupId>org.codehaus.mojo</groupId>
                <artifactId>exec-maven-plugin</artifactId>
                <version>1.6.0</version>
                <executions>
                    <execution>
                        <goals>
                            <goal>exec</goal>
                        </goals>
                    </execution>
                </executions>
                <configuration>
                    <executable>java</executable>
                    <includeProjectDependencies>true</includeProjectDependencies>
                    <includePluginDependencies>false</includePluginDependencies>
                    <classpathScope>compile</classpathScope>
                    <mainClass>org.example.Main</mainClass>
                </configuration>
            </plugin>
        </plugins>
    </build>
</project>

Can anyone tell me why this problem occurs? Thanks in advance.

yhuiod9q1#

Most likely you compiled your code locally against Scala 2.12, but the server is running Scala 2.13 (or 2.11?).
Try recompiling the code with the Scala version that is actually used on the server.
Scala 2.11, 2.12, and 2.13 are not binary compatible.
The signature of refArrayOps differs between versions.

Scala 2.12:

    def refArrayOps(scala.Array[scala.Any]): scala.Array[scala.Any]              (scalap)
    public <T> T[] refArrayOps(T[])                                              (javap)
    implicit def refArrayOps[T <: AnyRef](xs: Array[T]): ofRef[T]                (API)

Scala 2.13:

    def refArrayOps(scala.Array[scala.Any]): scala.Any                           (scalap)
    public <T> T[] refArrayOps(T[])                                              (javap)
    implicit def refArrayOps[T <: AnyRef](xs: Array[T]): ArrayOps[T] @inline()   (API)
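The usual fix is to make the compile-time Scala version (and the _2.1x suffix of the Spark artifacts) match the cluster. A hedged sketch of the relevant pom.xml changes, assuming the server's Spark 2.4.4 was built for Scala 2.11 (the default for Spark 2.4.x downloads) — substitute the version your server actually runs:

```xml
<!-- Sketch only: align the compile-time Scala with the cluster's Scala.
     If the server runs a Scala 2.11 build of Spark, compile with 2.11
     and depend on the _2.11 Spark artifacts. -->
<properties>
    <scala.version>2.11.12</scala.version>
    <scala.binary.version>2.11</scala.binary.version>
    <spark.version>2.4.4</spark.version>
</properties>
<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-graphx_${scala.binary.version}</artifactId>
        <version>${spark.version}</version>
        <!-- provided: the cluster supplies Spark and scala-library at runtime,
             so they should not be bundled into the jar-with-dependencies -->
        <scope>provided</scope>
    </dependency>
</dependencies>
```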
See also:
- Kafka start error on MAC .. something related to java and scala ... NoSuchMethodError: scala.Predef$.refArrayOps
- java.lang.NoSuchMethodError: scala.Predef$.refArrayOps
- How do I fix a NoSuchMethodError?
- java.lang.NoSuchMethodError: org.apache.hadoop.hive.common.FileUtils.mkdir while trying to save a table to Hive

wyyhbhjk2#

classloader: scala.reflect.internal.util.ScalaClassLoader$URLClassLoader
classloader urls:
file:/home/hadoop/Spark/jdk/jre/lib/resources.jar
file:/home/hadoop/Spark/jdk/jre/lib/rt.jar
file:/home/hadoop/Spark/jdk/jre/lib/jsse.jar
file:/home/hadoop/Spark/jdk/jre/lib/jce.jar
file:/home/hadoop/Spark/jdk/jre/lib/charsets.jar
file:/home/hadoop/Spark/jdk/jre/lib/jfr.jar
file:/home/hadoop/Spark/scala/lib/jline-2.14.6.jar
file:/home/hadoop/Spark/scala/lib/scala-compiler.jar
file:/home/hadoop/Spark/scala/lib/scala-library.jar
file:/home/hadoop/Spark/scala/lib/scalap-2.12.7.jar
file:/home/hadoop/Spark/scala/lib/scala-parser-combinators_2.12-1.0.7.jar
file:/home/hadoop/Spark/scala/lib/scala-reflect.jar
file:/home/hadoop/Spark/scala/lib/scala-swing_2.12-2.0.3.jar
file:/home/hadoop/Spark/scala/lib/scala-xml_2.12-1.0.6.jar
file:/home/hadoop/Spark/jdk/jre/lib/ext/sunjce_provider.jar
file:/home/hadoop/Spark/jdk/jre/lib/ext/dnsns.jar
file:/home/hadoop/Spark/jdk/jre/lib/ext/jaccess.jar
file:/home/hadoop/Spark/jdk/jre/lib/ext/zipfs.jar
file:/home/hadoop/Spark/jdk/jre/lib/ext/nashorn.jar
file:/home/hadoop/Spark/jdk/jre/lib/ext/sunec.jar
file:/home/hadoop/Spark/jdk/jre/lib/ext/sunpkcs11.jar
file:/home/hadoop/Spark/jdk/jre/lib/ext/jfxrt.jar
file:/home/hadoop/Spark/jdk/jre/lib/ext/localedata.jar
file:/home/hadoop/Spark/jdk/jre/lib/ext/cldrdata.jar
file:/home/hadoop/./
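The classloader URLs above show which scala-library.jar the driver loads, but not its version. A small sketch for confirming the Scala version actually seen at runtime — running this on the server (e.g. via spark-submit) prints the loaded library's version and origin:

```scala
// Sketch: print which Scala library the JVM actually loaded.
// Useful for confirming the server-side Scala version before
// chasing a binary-compatibility mismatch.
object ScalaVersionCheck {
  def main(args: Array[String]): Unit = {
    // Version of the scala-library on the classpath, e.g. "2.12.7"
    println(scala.util.Properties.versionNumberString)
    // Where scala-library was loaded from (may be null if it is on
    // the boot classpath, hence the Option wrapper)
    val src = Option(classOf[List[_]].getProtectionDomain.getCodeSource)
    src.foreach(s => println(s.getLocation))
  }
}
```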
