Failed to access metastore — this class should not be accessed at runtime

8yparm6h  posted on 2021-05-27  in Spark

I get the error below when I try to write/save data from Spark to Hive. Before running the job I start the metastore from a terminal with hive --service metastore, but I still get the metastore access failure. Please help me figure this out.
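For context: enableHiveSupport() only reaches a standalone metastore if a hive-site.xml on Spark's classpath points at it; otherwise Spark falls back to an embedded Derby metastore, which can fail exactly like this when another process already holds the Derby lock. A minimal sketch of such an entry, assuming the default Thrift port 9083 on localhost (adjust to your setup):

```xml
<!-- Assumed location: $SPARK_HOME/conf/hive-site.xml; host and port are assumptions -->
<configuration>
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://localhost:9083</value>
  </property>
</configuration>
```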

20/08/24 15:55:46 WARN Hive: Failed to access metastore. This class should not accessed in runtime.
org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1236)
at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
at org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:183)
at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:117)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:271)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:384)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:286)
at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)

Spark code

package com.yotpo.saudie

import org.apache.log4j.{Level, Logger}
import org.apache.spark.sql.SparkSession

object SaudieTester {
  def main(args: Array[String]): Unit = {

// Set the log level to only print errors
Logger.getLogger("org").setLevel(Level.ERROR)

    // Use the SparkSession interface introduced in Spark 2.0
    val spark = SparkSession
      .builder()
      .appName("SparkSQL")
      .master("local[*]")
      // Necessary to work around a Windows bug in Spark 2.0.0; omit if you're not on Windows.
      .config("spark.sql.warehouse.dir", "saudie/spark-warehouse")
      .enableHiveSupport()
      .getOrCreate()

    import spark.implicits._

    val jdbcDF = spark.read
      .format("jdbc")
      .option("url", "jdbc:mysql://ec2.compute.amazonaws.com:3306")
      .option("dbtable", "restaurant.cart")
      .option("user", "root")
      .option("password", "ROOT")
      .option("driver", "com.mysql.jdbc.Driver")
      .load()

    // Enable dynamic partitioning before the partitioned write
    spark.conf.set("hive.exec.dynamic.partition", "true")
    spark.conf.set("hive.exec.dynamic.partition.mode", "nonstrict")

    jdbcDF.show()
    jdbcDF.createOrReplaceTempView("recordstable")
    jdbcDF.write.partitionBy("key").format("hive").saveAsTable("recordstable")
  }
}
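Since the SessionHiveMetaStoreClient cannot even be instantiated, a likely cause is that Spark is not pointed at the running metastore and is trying an embedded Derby one instead. A hedged sketch of passing the metastore address at submit time (the thrift://localhost:9083 address and the jar path are assumptions, not taken from the question):

```shell
# Config fragment, not a full command: pass the metastore address to Spark.
# thrift://localhost:9083 is the default port of `hive --service metastore`;
# target/saudie.jar is a hypothetical artifact name -- adjust both.
spark-submit \
  --conf spark.hadoop.hive.metastore.uris=thrift://localhost:9083 \
  --class com.yotpo.saudie.SaudieTester \
  target/saudie.jar
```

The same setting can alternatively be placed in hive-site.xml on Spark's classpath or set via .config(...) on the SparkSession builder before getOrCreate().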

No answers yet!

