Scala SparkR: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState'

qcuzuvrc posted on 2022-11-09 in Scala

I am struggling with this error when trying to use SparkR:

sparkR.session(master = "local[*]", sparkConfig = list(spark.driver.memory = "1g"))
Error in handleErrors(returnStatus, conn) : 
  java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState':
    at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:981)
    at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
    at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
    at org.apache.spark.sql.api.r.SQLUtils$$anonfun$setSparkContextSessionConf$2.apply(SQLUtils.scala:67)
    at org.apache.spark.sql.api.r.SQLUtils$$anonfun$setSparkContextSessionConf$2.apply(SQLUtils.scala:66)
    at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)
    at scala.collection.Iterator$class.foreach(Iterator.scala:893)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
    at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
    at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
    at scala.collection.Traversabl

I am hoping for a clear solution; I am new to this and know nothing about Java or Scala. Many thanks!


qfe3c7zg1#

I got the same error. It appears to be related to user permissions, so you have two options:
1) Start SparkR from a directory where you have the necessary permissions (precondition: the Spark bin folder must be on your PATH environment variable: export PATH=$SPARK_HOME/bin:$PATH):

cd ~
sparkR
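The permissions point above can be checked before launching. When Hive support is enabled, Spark typically creates a metastore_db directory and a derby.log file in the current working directory, so the launch directory must be writable; this is why starting from your home directory helps. A minimal sketch in a POSIX shell:

```shell
# Sketch: verify the current directory is writable before launching SparkR.
# With Hive support enabled, Spark typically creates metastore_db/ and
# derby.log in the working directory; a read-only directory fails here.
if [ -w "$(pwd)" ]; then
  echo "OK: $(pwd) is writable"
else
  echo "Not writable: launch SparkR from another directory (e.g. cd ~)"
fi
```

If the check fails, either move to a writable directory or fix ownership of the current one rather than falling back to sudo.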

2) Start sparkR with sudo privileges:

/opt/spark/bin $ sudo ./sparkR

mf98qq942#

Try removing HADOOP_CONF_DIR from your environment variables.
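Concretely, that amounts to unsetting the variable in the shell session before relaunching SparkR. A minimal sketch:

```shell
# Sketch: drop HADOOP_CONF_DIR from the environment for this shell session,
# then verify it is gone before relaunching sparkR.
unset HADOOP_CONF_DIR
echo "HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-<unset>}"
```

To make the change permanent, remove the corresponding export line from wherever it is set (e.g. ~/.bashrc or a Spark/Hadoop env script).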
