Why can't I install Apache Spark with Scala?

cld4siwp, posted 2021-07-09 in Spark

For the past few months I have been trying, one way or another, to install Apache Spark and Scala on my computer. Tonight I tried three times, following the instructions here: https://sparkbyexamples.com/spark/spark-setup-run-with-scala-intellij/#google_vignette.
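
For context, the sbt setup that tutorials like that one walk through looks roughly like the sketch below. The version numbers are illustrative assumptions on my part, not copied from my project; the point is that Spark, the Scala compiler, and any test library all have to target the same Scala binary version:

// build.sbt -- an illustrative sketch, not the exact file from the tutorial
name := "spark-hello-world-example"

version := "0.1"

// Spark 3.1.x artifacts are published for Scala 2.12, so the project's
// scalaVersion has to match. From what I have read, a test library built
// against a much older Scala (such as the legacy org.specs 1.x) is
// binary-incompatible and can make scalac fail with errors like
// "bad constant pool index".
scalaVersion := "2.12.13"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "3.1.2",
  "org.apache.spark" %% "spark-sql"  % "3.1.2",
  // ScalaTest is one test framework with current Scala 2.12 builds
  "org.scalatest" %% "scalatest" % "3.2.9" % Test
)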
Now I am getting this error:

scalac: 
  bad constant pool index: 318 at pos: 2716
     while compiling: C:\Users\cohnb\IdeaProjects\spark-hello-world-example\src\test\scala\org\example\MySpec.scala
  last tree to typer: Ident(JUnit4)
       tree position: line 6 of C:\Users\cohnb\IdeaProjects\spark-hello-world-example\src\test\scala\org\example\MySpec.scala
              symbol: <none>
   symbol definition: <none> (a NoSymbol)
      symbol package: <none>
       symbol owners: 
           call site: class MySpecTest in package example in package example

== Source file context for tree position ==

 3 import org.specs._
 4 import org.specs.runner.{ConsoleRunner, JUnit4}
 5 
 6 class MySpecTest extends JUnit4(MySpec)
 7 //class MySpecSuite extends ScalaTestSuite(MySpec)
 8 object MySpecRunner extends ConsoleRunner(MySpec)
 9

Why can't I get Spark to work? Thanks!
