How to stop Spark from setting a new Hive version

tpgth1q7 · posted 2021-06-26 in Hive

My Hive metastore version is 2.1.0, but when I start spark-shell it updates the version to 1.2.0.

17/06/11 12:04:03 WARN DataNucleus.General: Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/root/spark-2.1.1-bin-hadoop2.7/jars/datanucleus-core-3.2.10.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/root/spark/jars/datanucleus-core-3.2.10.jar."
17/06/11 12:04:07 ERROR metastore.ObjectStore: Version information found in metastore differs 2.1.0 from expected schema version 1.2.0. Schema verififcation is disabled hive.metastore.schema.verification so setting version.
17/06/11 12:04:09 WARN metastore.ObjectStore: Failed to get database global_temp, returning NoSuchObjectException

This has left my Hive installation broken. I already tried setting spark.sql.hive.metastore.version 2.1.0 in spark-defaults.conf… after that my spark-shell won't start at all. Please help.
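For reference, here is what that attempt looks like as a minimal spark-defaults.conf sketch. Two caveats: Spark 2.1.x only accepts metastore versions up to 1.2.1 (Spark 2.2 and later accept 2.1.x), which would explain the shell refusing to start, and the version setting needs spark.sql.hive.metastore.jars alongside it so Spark can load a matching Hive client:

# spark-defaults.conf -- a sketch, assuming Spark >= 2.2;
# Spark 2.1.x rejects metastore versions above 1.2.1
spark.sql.hive.metastore.version  2.1.0
# "maven" makes Spark pull the matching Hive client jars from Maven Central;
# a classpath pointing at pre-downloaded Hive 2.1.0 jars also works
spark.sql.hive.metastore.jars     maven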

hgb9j2n6 · #1

You should be able to disable version verification by updating hive-site.xml:

<property>
  <name>hive.metastore.schema.verification</name>
  <!-- <value>true</value> -->
  <value>false</value>
  <description>
    Enforce metastore schema version consistency.
    True: Verify that version information stored in the metastore is compatible with the one from the Hive jars. Also disable automatic
          schema migration attempt. Users are required to manually migrate the schema after a Hive upgrade, which ensures
          proper metastore schema migration. (Default)
    False: Warn if the version information stored in the metastore doesn't match the one from the Hive jars.
  </description>
</property>
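If the VERSION row in the metastore database has already been overwritten to 1.2.0, Hive ships a schematool that can report the recorded schema version against the one the installed Hive jars expect. A sketch, assuming a MySQL-backed metastore (adjust -dbType for your database):

# Print the schema version recorded in the metastore database
# and the version the current Hive jars expect
$HIVE_HOME/bin/schematool -dbType mysql -info

# Only if the schema itself is genuinely behind: run the upgrade scripts
$HIVE_HOME/bin/schematool -dbType mysql -upgradeSchema

If only the version row was changed while the tables are still on the 2.1.0 layout, setting that row back to 2.1.0 in the backing database should be enough; -upgradeSchema is for a schema that actually needs migrating.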
