My Hive metastore version is 2.1.0, but when I start spark-shell it updates the version to 1.2.0:
17/06/11 12:04:03 WARN DataNucleus.General: Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/root/spark-2.1.1-bin-hadoop2.7/jars/datanucleus-core-3.2.10.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/root/spark/jars/datanucleus-core-3.2.10.jar."
17/06/11 12:04:07 ERROR metastore.ObjectStore: Version information found in metastore differs 2.1.0 from expected schema version 1.2.0. Schema verififcation is disabled hive.metastore.schema.verification so setting version.
17/06/11 12:04:09 WARN metastore.ObjectStore: Failed to get database global_temp, returning NoSuchObjectException
This has caused my Hive to stop working. I have already tried setting spark.sql.hive.metastore.version to 2.1.0 in spark-defaults.conf… but then my spark-shell does not work. Please help.
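For reference, the setting I added to spark-defaults.conf was of roughly this form:

# Ask Spark to talk to the metastore as Hive 2.1.0 instead of its built-in 1.2 client.
# Note: a complete setup usually also needs spark.sql.hive.metastore.jars
# (e.g. "maven" or an explicit classpath); that line is not shown here because
# it is an assumption about a full configuration, not something I have verified.
spark.sql.hive.metastore.version   2.1.0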
1 Answer
hgb9j2n61#
You should be able to disable the schema version verification by updating hive-site.xml.
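As a minimal sketch, the property in question is the same key that appears in the warning above, hive.metastore.schema.verification. Add (or change) it in hive-site.xml like this; placing the file (or at least this property) on Spark's conf/ path as well is usually needed so spark-shell picks it up:

<property>
  <!-- Skip the metastore schema version check. This only silences the
       version-mismatch error; it does not upgrade or downgrade the schema
       actually stored in the metastore database. -->
  <name>hive.metastore.schema.verification</name>
  <value>false</value>
</property>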