Saving a Hive table from Spark with Hive 2.1.1

mbjcgjjk · posted 2021-06-26 in Hive

I am using Spark 2.2 with Scala, Hive 2.1.1, and Zeppelin on Ubuntu 16.04. I have also copied hive-site.xml into spark/conf/ and mysql-connector-java.jar from hive/lib into spark/jars.
I want to save a DataFrame as a Hive table, and I am using:

    val hc = new org.apache.spark.sql.hive.HiveContext(sc)
    df.registerTempTable("myTempTable")
    hc.sql("create table store_sales as select * from myTempTable")
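As a side note, on Spark 2.x the same save can also be written against `SparkSession` (`HiveContext` and `registerTempTable` are deprecated there). A minimal sketch, assuming a Spark 2.x build with Hive support on the classpath (the sample DataFrame is a placeholder):

```scala
// Sketch only: requires a Spark 2.x deployment with Hive support
// and a hive-site.xml visible in spark/conf/.
import org.apache.spark.sql.{DataFrame, SaveMode, SparkSession}

val spark = SparkSession.builder()
  .appName("save-hive-table")
  .enableHiveSupport() // use the Hive metastore from hive-site.xml
  .getOrCreate()

val df: DataFrame = spark.range(10).toDF("id") // placeholder DataFrame

// Writes through the Hive metastore; equivalent to the CTAS above.
df.write.mode(SaveMode.Overwrite).saveAsTable("store_sales")
```

In Zeppelin's Spark interpreter the `spark` session is already provided, so only the last line is needed there.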

In my notebook I then run:

    %hive
    show tables;

and I can see that my new Hive table store_sales has been created, but after this I can no longer run hive.
Here is my hive-site.xml:

    <configuration>
      <property>
        <name>javax.jdo.option.ConnectionURL</name>
        <value>jdbc:mysql://localhost/metastore?useSSL=false</value>
        <description>metadata is stored in a MySQL server</description>
      </property>
      <property>
        <name>javax.jdo.option.ConnectionDriverName</name>
        <value>com.mysql.jdbc.Driver</value>
        <description>MySQL JDBC driver class</description>
      </property>
      <property>
        <name>javax.jdo.option.ConnectionUserName</name>
        <value>hive</value>
        <description>user name for connecting to mysql server</description>
      </property>
      <property>
        <name>javax.jdo.option.ConnectionPassword</name>
        <value>hive</value>
        <description>password for connecting to mysql server</description>
      </property>
      <property>
        <name>hive.execution.engine</name>
        <value>spark</value>
        <description>set hive on spark</description>
      </property>
    </configuration>

When I run hive I get:

    root@alex-bi:/usr/local/hive/bin# hive
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:file:/usr/local/hive/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
    Logging initialized using configuration in jar:file:/usr/local/hive/lib/hive-common-2.1.1.jar!/hive-log4j2.properties Async: true
    Exception in thread "main" java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:591)
        at org.apache.hadoop.hive.ql.session.SessionState.beginStart(SessionState.java:531)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:705)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:641)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
    Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
        at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:226)
        at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:366)
        at org.apache.hadoop.hive.ql.metadata.Hive.create(Hive.java:310)
        at org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:290)
        at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:266)
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:558)
        ... 9 more
    Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
        at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1654)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:80)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:130)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:101)
        at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3367)
        at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3406)
        at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3386)
        at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3640)
        at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:236)
        at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:221)
        ... 14 more
    Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1652)
        ... 23 more
    Caused by: MetaException(message:Hive Schema version 2.1.0 does not match metastore's schema version 1.2.0 Metastore is not upgraded or corrupt)
        at org.apache.hadoop.hive.metastore.ObjectStore.checkSchema(ObjectStore.java:7768)
        at org.apache.hadoop.hive.metastore.ObjectStore.verifySchema(ObjectStore.java:7731)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:101)
        at com.sun.proxy.$Proxy21.verifySchema(Unknown Source)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:565)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:626)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:416)
        at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:78)
        at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:84)
        at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:6490)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:238)
        at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:70)
        ... 28 more
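The last `Caused by` is the real failure: the MySQL metastore carries schema version 1.2.0 (written when Spark's bundled Hive 1.2.1 client first touched it), while the Hive 2.1.1 CLI expects 2.1.0. One common remedy, assuming the metastore contains nothing that cannot be recreated or that a backup exists, is to upgrade the schema with Hive's own schematool:

```shell
# Back up the MySQL "metastore" database first, then upgrade its
# schema from 1.2.0 to the version matching the local Hive 2.1.1.
/usr/local/hive/bin/schematool -dbType mysql -upgradeSchemaFrom 1.2.0
```

Note that after an upgrade the old Spark setup may then hit the mismatch from the other side, unless Spark is also configured to use a 2.1.x metastore client.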

I think the problem is with these jars located in spark/jars:

    hive-beeline-1.2.1.spark2.jar
    hive-cli-1.2.1.spark2.jar
    hive-exec-1.2.1.spark2.jar
    hive-jdbc-1.2.1.spark2.jar
    hive-metastore-1.2.1.spark2.jar

because Spark saves Hive tables through its bundled Hive 1.2.1 client by default, and I don't know how to configure Spark to save Hive tables for version 2.1.1, my local Hive installation. I replaced these jars with the version 2.1.1 jars from hive/lib, but that didn't work, and I don't know what else to try; Google hasn't helped.
I need this because I want to save a Spark DataFrame into a Hive table and later read that table with Kylin. Maybe this isn't the right approach and there is another way.
Thanks in advance.
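Rather than swapping jars inside spark/jars (which tends to break Spark's own classpath), Spark 2.2 exposes configuration for pointing its Hive client at a different metastore version. A hedged sketch for conf/spark-defaults.conf, assuming Hive lives under /usr/local/hive; the exact classpath (which must also include the Hadoop jars) depends on the installation:

```
# Use a Hive 2.1.x metastore client instead of the bundled 1.2.1,
# so Spark neither writes nor expects the 1.2.0 schema.
spark.sql.hive.metastore.version   2.1.1
# Classpath containing the Hive 2.1.1 jars plus Hadoop jars.
spark.sql.hive.metastore.jars      /usr/local/hive/lib/*:/usr/local/hadoop/share/hadoop/common/*
```

With this in place the original spark/jars directory stays untouched; Spark loads the 2.1.1 client in an isolated classloader only for metastore access.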
