Environment: spark2.11, hive2.2, hadoop2.8.2
The hive shell runs successfully with no errors or warnings, but launching application.sh fails:
/usr/local/spark/bin/spark-submit \
--class cn.spark.sql.Demo \
--num-executors 3 \
--driver-memory 512m \
--executor-memory 512m \
--executor-cores 3 \
--files /usr/local/hive/conf/hive-site.xml \
--driver-class-path /usr/local/hive/lib/mysql-connector-java.jar \
/usr/local/java/sql/sparkstudyjava.jar \
The error:
Exception in thread "main" java.lang.IllegalArgumentException: Error while
instantiating 'org.apache.spark.sql.hive.HiveSessionState':
...
Caused by: java.lang.IllegalArgumentException: Error while instantiating
'org.apache.spark.sql.hive.HiveExternalCatalog':
...
Caused by: java.lang.RuntimeException: java.lang.RuntimeException: Unable to
instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
...
Caused by: java.lang.RuntimeException: Unable to instantiate
org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
...
Caused by: MetaException(message:Hive Schema version 1.2.0 does not match
metastore's schema version 2.1.0 Metastore is not upgraded or corrupt)
...
I have tried many ways to resolve this error, but it still occurs. How can I fix it?
3 Answers
hxzsmxv21#
It may be picking up a different Hive installation (with a different configuration). Run the command below and check whether its output points somewhere other than /usr/local/hive.
If both point to the same Hive directory, add the following property to hive-site.xml.
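The command itself did not survive in the post; as an assumption, it was probably something like `which hive`. A minimal sketch of that check:

```shell
# Assumption: the elided command compared the `hive` binary actually on
# PATH against the expected install root from the question.
EXPECTED_ROOT=/usr/local/hive
HIVE_ON_PATH=$(command -v hive || echo "not-found")
case "$HIVE_ON_PATH" in
  "$EXPECTED_ROOT"/*) MATCH=yes ;;   # PATH resolves to the expected install
  *)                  MATCH=no  ;;   # a different Hive install is shadowing it
esac
echo "hive binary: $HIVE_ON_PATH (under $EXPECTED_ROOT: $MATCH)"
```

If MATCH is "no", the shell and Spark are talking to two different Hive installations, which explains the schema mismatch.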
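The property to add was also lost from the post. Given the MetaException about mismatched schema versions above, a likely candidate (an assumption, not confirmed by the answer) is Hive's schema-verification switch:

```xml
<!-- Assumed property: relaxes the strict schema-version check that raises
     "Hive Schema version ... does not match metastore's schema version" -->
<property>
  <name>hive.metastore.schema.verification</name>
  <value>false</value>
</property>
```

Disabling verification only hides the mismatch; upgrading the metastore schema with Hive's schematool is the cleaner fix.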
omvjsjqw2#
I hit a similar problem because of a Spark/Hive version mismatch: Spark 2.0 ships with Hive 1.2.0 built in, while the default Hive I was using was 0.14.0. Passing the metastore version and jars when starting pyspark solved it:
pyspark --master yarn --num-executors 1 --executor-memory 512m --conf spark.sql.hive.metastore.version=0.14.0 --conf spark.sql.hive.metastore.jars=/usr/local/hive/apache-hive-0.14.0-bin/*
dfddblmv3#
Sometimes the Hive jars bundled with Spark differ from the installed Hive version. To handle this, pass the jars and the version as --conf parameters when submitting the Spark job, for example: --conf spark.sql.hive.metastore.version=2.3.0 --conf spark.sql.hive.metastore.jars=/home/apache-hive-2.3.6-bin/lib/*
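Applying the same idea to the asker's original command gives a submit like the sketch below. The version value 2.1.0 (taken from the schema version the metastore reported in the error) and the lib path are assumptions; match them to the actual Hive install:

```shell
# Sketch: the asker's spark-submit with the metastore client pinned to the
# installed Hive instead of Spark's built-in 1.2.0 client.
# Assumed values: metastore.version=2.1.0, jars under /usr/local/hive/lib.
/usr/local/spark/bin/spark-submit \
  --class cn.spark.sql.Demo \
  --num-executors 3 \
  --driver-memory 512m \
  --executor-memory 512m \
  --executor-cores 3 \
  --files /usr/local/hive/conf/hive-site.xml \
  --driver-class-path /usr/local/hive/lib/mysql-connector-java.jar \
  --conf spark.sql.hive.metastore.version=2.1.0 \
  --conf spark.sql.hive.metastore.jars=/usr/local/hive/lib/* \
  /usr/local/java/sql/sparkstudyjava.jar
```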