Apache Spark 2.2.0 cannot connect to the metastore after upgrading the Hive metastore

b0zn9rqh · posted 2021-06-26 in Hive

I get the following error when running spark-shell:

  Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
  Setting default log level to "WARN".
  To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
  18/01/30 18:22:27 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
  18/01/30 18:22:29 WARN Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
  18/01/30 18:22:29 WARN Utils: Service 'SparkUI' could not bind on port 4041. Attempting port 4042.
  java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionStateBuilder':
  Caused by: scala.MatchError: 2.3.0 (of class java.lang.String)
    at org.apache.spark.sql.hive.client.IsolatedClientLoader$.hiveVersion(IsolatedClientLoader.scala:89)
    at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:279)
    at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:266)
    at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
    at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
    at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:194)
    at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:194)
    at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:194)
    at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
    at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:193)
    at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:105)
    at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:93)
    at org.apache.spark.sql.hive.HiveSessionStateBuilder.externalCatalog(HiveSessionStateBuilder.scala:39)
    at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog$lzycompute(HiveSessionStateBuilder.scala:54)
    at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:52)
    at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:35)
    at org.apache.spark.sql.internal.BaseSessionStateBuilder.build(BaseSessionStateBuilder.scala:289)
    at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$instantiateSessionState(SparkSession.scala:1050)
    ... 61 more
  <console>:14: error: not found: value spark
         import spark.implicits._
                ^
  <console>:14: error: not found: value spark
         import spark.sql
                ^

kmynzznz1#

As of today (Spark 2.3.0 RC), the latest supported Hive metastore version is 2.1. The relevant code:

  def hiveVersion(version: String): HiveVersion = version match {
    case "12" | "0.12" | "0.12.0" => hive.v12
    case "13" | "0.13" | "0.13.0" | "0.13.1" => hive.v13
    case "14" | "0.14" | "0.14.0" => hive.v14
    case "1.0" | "1.0.0" => hive.v1_0
    case "1.1" | "1.1.0" => hive.v1_1
    case "1.2" | "1.2.0" | "1.2.1" | "1.2.2" => hive.v1_2
    case "2.0" | "2.0.0" | "2.0.1" => hive.v2_0
    case "2.1" | "2.1.0" | "2.1.1" => hive.v2_1
  }
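
Note that this match has no catch-all case. A minimal standalone sketch (illustrative names, not Spark's code) of what happens to an unlisted version string:

  // Illustrative demo, not Spark's code: a Scala match with no
  // catch-all case throws scala.MatchError at runtime for any
  // input that no pattern covers.
  object NonExhaustiveMatchDemo extends App {
    def pick(version: String): String = version match {
      case "2.0" | "2.0.0" | "2.0.1" => "v2_0"
      case "2.1" | "2.1.0" | "2.1.1" => "v2_1"
    }

    println(pick("2.1.1")) // prints "v2_1"
    println(pick("2.3.0")) // throws scala.MatchError: 2.3.0 (of class java.lang.String)
  }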

Hive 2.3 is not supported, so the version string "2.3.0" falls through the match and spark-shell fails with the scala.MatchError shown in the question.
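
If downgrading the metastore is not an option, a common workaround is to pin Spark to a Hive client version it does support via spark.sql.hive.metastore.version and spark.sql.hive.metastore.jars, and rely on the metastore's backward compatibility. A sketch, assuming the 2.1 client jars can be resolved from Maven; whether your upgraded 2.3 metastore accepts a 2.1 client depends on your setup:

  // Sketch: pin the Hive client to a version listed in the match above.
  // spark-shell equivalent:
  //   spark-shell --conf spark.sql.hive.metastore.version=2.1.1 \
  //               --conf spark.sql.hive.metastore.jars=maven
  import org.apache.spark.sql.SparkSession

  val spark = SparkSession.builder()
    .appName("pin-hive-metastore-client")
    .config("spark.sql.hive.metastore.version", "2.1.1") // supported by the match above
    .config("spark.sql.hive.metastore.jars", "maven")    // resolve matching client jars
    .enableHiveSupport()
    .getOrCreate()

  spark.sql("SHOW DATABASES").show() // exercises the metastore connection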
