spark-sql: value implicits is not a member of org.apache.spark.sql.SQLContext

falq053o · posted on 2021-06-02 in Hadoop

Hi, please see the code below and the corresponding errors. Even though I have added the import statements, it still fails to compile:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql._

val sparkConf = new SparkConf().setAppName("new_proj")
implicit val sc = new SparkContext(sparkConf)

val sqlContext = new org.apache.spark.sql.SQLContext(sc)
import sqlContext._
import sqlContext.implicits._   // flagged by the compiler

val projects = sqlContext.read.json("/part-m-00000.json")   // flagged by the compiler

[error] /mapr/trans.scala:25: value implicits is not a member of org.apache.spark.sql.SQLContext
[error] import sqlContext.implicits._
[error]                   ^
[error] /mapr/ppm_trans.scala:28: value read is not a member of org.apache.spark.sql.SQLContext
[error] val projects = sqlContext.read.json("/mapr//part-m-00000.json")


Answer 1 (cbeh67ev):

I was able to get the code to compile by changing the following lines in build.sbt. The likely cause is that sqlContext.implicits (Spark 1.3+) and sqlContext.read / DataFrameReader (Spark 1.4+) only exist when a recent enough spark-sql artifact is on the compile classpath, so a missing or older spark-sql dependency produces exactly these "is not a member" errors:

libraryDependencies ++= Seq(
  "org.apache.spark"  % "spark-core_2.10"              % "1.4.0" % "provided",
  // spark-sql must be a compile-time dependency (not just spark-core)
  // so that SQLContext.implicits and sqlContext.read resolve
  "org.apache.spark"  % "spark-sql_2.10"               % "1.4.0",
  "org.apache.spark"  % "spark-mllib_2.10"             % "1.4.0"
  )
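For reference, a minimal self-contained sketch of the same program that compiles against those 1.4.0 artifacts is shown below. The object/main wrapper, the printSchema() call, and the sc.stop() call are illustrative additions, not part of the original post; the input path is kept from the question.

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Hypothetical wrapper object; the original post shows only the script body.
object NewProj {
  def main(args: Array[String]): Unit = {
    val sparkConf = new SparkConf().setAppName("new_proj")
    val sc = new SparkContext(sparkConf)

    val sqlContext = new SQLContext(sc)
    // Resolves only when spark-sql 1.3+ is on the compile classpath
    import sqlContext.implicits._

    // DataFrameReader (sqlContext.read) was added in Spark 1.4
    val projects = sqlContext.read.json("/part-m-00000.json")
    projects.printSchema() // illustrative check that the JSON was loaded

    sc.stop()
  }
}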
