Hive is not enabled by default

slsn1g29 · asked 2021-05-27 · in Spark

Hive does not seem to be enabled by default. How do I switch Spark SQL from the "in-memory" catalog to Hive?
This is Spark 2.4.5.
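
A quick sanity check worth running first (a hedged sketch; it assumes the SPARK_HOME environment variable points at the Spark install and that its jars directory exists): a Hive-enabled build ships spark-hive jars under $SPARK_HOME/jars, so listing them shows whether this distribution was built with Hive support at all.

// sketch: list any Hive-related jars bundled with the distribution
// (assumes SPARK_HOME is set and $SPARK_HOME/jars exists)
new java.io.File(sys.env("SPARK_HOME"), "jars")
  .listFiles
  .map(_.getName)
  .filter(_.contains("hive"))
  .foreach(println)   // empty output suggests a build without Hive support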

scala> :paste
// Entering paste mode (ctrl-D to finish)

import org.apache.spark.sql.SparkSession
val spark = SparkSession
  .builder
  .enableHiveSupport()  // <-- enables Hive support
  .getOrCreate

// Exiting paste mode, now interpreting.

java.lang.IllegalArgumentException: Unable to instantiate SparkSession with Hive support because Hive classes are not found.
  at org.apache.spark.sql.SparkSession$Builder.enableHiveSupport(SparkSession.scala:869)
  ... 49 elided

scala>

scala> sql("set spark.sql.catalogImplementation").show(false)
+-------------------------------+---------+
|key                            |value    |
+-------------------------------+---------+
|spark.sql.catalogImplementation|in-memory|
+-------------------------------+---------+
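
The exception itself narrows things down. As far as I can tell (an assumption based on reading Spark 2.4's SparkSession source, not something the error message states), enableHiveSupport only verifies that two classes are loadable, so one can probe for them directly in the shell:

// sketch: the two classes Spark 2.4's enableHiveSupport appears to check for;
// a ClassNotFoundException here means spark-hive (and Hive itself) are not on the classpath
Class.forName("org.apache.spark.sql.hive.HiveSessionStateBuilder")
Class.forName("org.apache.hadoop.hive.conf.HiveConf")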

Update 1: no luck explicitly setting spark.sql.catalogImplementation=hive

spark-shell --conf spark.sql.catalogImplementation=hive

Spark context Web UI available at http://localhost:4040
Spark context available as 'sc' (master = local[*], app id = local-1595914807895).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.4.5
      /_/

Using Scala version 2.11.12 (OpenJDK 64-Bit Server VM, Java 1.8.0_232)
Type in expressions to have them evaluated.
Type :help for more information.

scala> sql("set spark.sql.catalogImplementation").show(false)
+-------------------------------+---------+
|key                            |value    |
+-------------------------------+---------+
|spark.sql.catalogImplementation|in-memory|
+-------------------------------+---------+
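
This outcome is consistent with the shell quietly downgrading the setting: --conf only sets a flag, it does not put any Hive classes on the classpath, and (my reading of the 2.4 REPL startup code, so treat it as an assumption) spark-shell falls back to the in-memory catalog when the Hive classes are absent rather than failing at launch. One can confirm from inside the shell that nothing Hive-related is on the driver classpath:

// sketch: inspect the driver classpath for Hive-related jars
System.getProperty("java.class.path")
  .split(java.io.File.pathSeparator)
  .filter(_.toLowerCase.contains("hive"))
  .foreach(println)   // no output means --conf had nothing to enable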

Update 2: no luck either; this time I specified the Hive jars

spark-shell \
  --jars $HIVE_HOME/lib/hive-metastore-3.1.2.jar,$HIVE_HOME/lib/hive-exec-3.1.2.jar,$HIVE_HOME/lib/hive-common-3.1.2.jar,$HIVE_HOME/lib/hive-serde-3.1.2.jar \
  --conf spark.sql.hive.metastore.version=3.1.2 \
  --conf spark.sql.hive.metastore.jars="$HIVE_HOME/lib/*" \
  --conf spark.sql.warehouse.dir=hdfs://localhost:9000/user/hive/warehouse \
  --conf spark.sql.catalogImplementation=hive

    scala> sql("set spark.sql.catalogImplementation").show(false)
    +-------------------------------+---------+
    |key                            |value    |
    +-------------------------------+---------+
    |spark.sql.catalogImplementation|in-memory|
    +-------------------------------+---------+

scala> :paste
// Entering paste mode (ctrl-D to finish)

import org.apache.spark.sql.SparkSession
val spark = SparkSession
  .builder
  .enableHiveSupport()  // <-- enables Hive support
  .getOrCreate

// Exiting paste mode, now interpreting.

java.lang.IllegalArgumentException: Unable to instantiate SparkSession with Hive support because Hive classes are not found.
  at org.apache.spark.sql.SparkSession$Builder.enableHiveSupport(SparkSession.scala:869)
  ... 57 elided
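
One more note on update 2 (hedged; I have not verified this on this exact setup): spark.sql.hive.metastore.jars and --jars with hive-metastore/hive-exec only configure or supply the metastore client, while the class the exception complains about, org.apache.spark.sql.hive.HiveSessionStateBuilder, lives in Spark's own spark-hive module, which this build apparently lacks. A Spark build compiled with -Phive includes it; alternatively, pulling the module in at launch may work (Maven coordinates assumed for Spark 2.4.5 on Scala 2.11):

spark-shell \
  --packages org.apache.spark:spark-hive_2.11:2.4.5 \
  --conf spark.sql.catalogImplementation=hive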
