Spark 1.6 and Hive 0.14 integration issue

wj8zmpe1 · posted 2021-06-26 in Hive

I've been trying to integrate the latest Spark 1.6 with Hive 0.14.0. All I want is to get the Thrift server up and running. I noticed that unless I override the following settings when invoking Spark's start-thriftserver.sh script (--conf spark.sql.hive.metastore.version=0.14.0 --conf spark.sql.hive.metastore.jars=maven), any CREATE TABLE query fails in Spark because of incompatibilities between Hive 1.2.1, which Spark 1.6 uses by default, and the Hive version I run in production. However, when I do override those two settings, the Thrift server does not connect to my Hive metastore URI specified in hive-site.xml at startup; instead it tries to connect to a Derby database, and then fails to start properly. Am I missing some additional override?
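For context, the launch command with the overrides described above would look roughly like this. The metastore host/port and the explicit hive.metastore.uris override are assumptions for illustration (passing the URI directly is a common workaround when the isolated metastore classloader does not pick up hive-site.xml and falls back to Derby), not something confirmed by this post:

```shell
# Sketch: starting the Spark 1.6 Thrift server against a Hive 0.14 metastore.
# thrift://metastore-host:9083 is a placeholder; substitute your metastore URI.
./sbin/start-thriftserver.sh \
  --conf spark.sql.hive.metastore.version=0.14.0 \
  --conf spark.sql.hive.metastore.jars=maven \
  --hiveconf hive.metastore.uris=thrift://metastore-host:9083
```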
See the Thrift server log output below:

  Loaded from file:/usr/lib/spark/lib/spark-assembly-1.6.0-hadoop2.6.0.jar
  java.vendor=Oracle Corporation
  java.runtime.version=1.7.0_79-b15
  user.dir=/
  os.name=Linux
  os.arch=amd64
  os.version=2.6.32-504.23.4.el6.x86_64
  derby.system.home=null
  Database Class Loader started - derby.database.classpath=''
  16/01/26 16:35:20 INFO YarnClientSchedulerBackend: Registered executor NettyRpcEndpointRef(null) (10.15.150.38:51475) with ID 20
  16/01/26 16:35:20 INFO BlockManagerMasterEndpoint: Registering block manager 10.15.150.38:52107 with 9.9 GB RAM, BlockManagerId(20, 10.15.150.38, 52107)
  16/01/26 16:35:20 INFO YarnClientSchedulerBackend: Registered executor NettyRpcEndpointRef(null) (10.15.150.38:51479) with ID 48
  16/01/26 16:35:20 INFO BlockManagerMasterEndpoint: Registering block manager 10.15.150.38:47973 with 9.9 GB RAM, BlockManagerId(48, 10.15.150.38, 47973)
  16/01/26 16:35:20 WARN Configuration: org.apache.hadoop.hive.conf.LoopingByteArrayInputStream@3cf4a477:an attempt to override final parameter: mapreduce.reduce.speculative; Ignoring.
  16/01/26 16:35:20 INFO ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
  16/01/26 16:35:21 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
  16/01/26 16:35:21 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
  16/01/26 16:35:22 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
  16/01/26 16:35:22 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
  16/01/26 16:35:22 INFO MetaStoreDirectSql: Using direct SQL, underlying DB is DERBY
  16/01/26 16:35:22 INFO ObjectStore: Initialized ObjectStore
  16/01/26 16:35:22 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
  16/01/26 16:35:22 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
  16/01/26 16:35:22 INFO HiveMetaStore: Added admin role in metastore
  16/01/26 16:35:22 INFO HiveMetaStore: Added public role in metastore
  16/01/26 16:35:22 INFO HiveMetaStore: No user is added in admin role, since config is empty
  16/01/26 16:35:22 INFO HiveMetaStore: 0: get_all_databases
  16/01/26 16:35:22 INFO audit: ugi=hive ip=unknown-ip-addr cmd=get_all_databases
  16/01/26 16:35:22 INFO HiveMetaStore: 0: get_functions: db=default pat=*
  16/01/26 16:35:22 INFO audit: ugi=hive ip=unknown-ip-addr cmd=get_functions: db=default pat=*
  16/01/26 16:35:22 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MResourceUri" is tagged as "embedded-only" so does not have its own datastore table.
  16/01/26 16:35:22 INFO SessionState: Created local directory: /tmp/06895c7e-e26c-42b7-b100-4222d0356b6b_resources
  16/01/26 16:35:22 INFO SessionState: Created HDFS directory: /tmp/hive/hive/06895c7e-e26c-42b7-b100-4222d0356b6b
  16/01/26 16:35:22 INFO SessionState: Created local directory: /tmp/hive/06895c7e-e26c-42b7-b100-4222d0356b6b
  16/01/26 16:35:23 INFO SessionState: Created HDFS directory: /tmp/hive/hive/06895c7e-e26c-42b7-b100-4222d0356b6b/_tmp_space.db
  16/01/26 16:35:23 WARN Configuration: org.apache.hadoop.hive.conf.LoopingByteArrayInputStream@37f031a:an attempt to override final parameter: mapreduce.reduce.speculative; Ignoring.
  16/01/26 16:35:23 INFO HiveContext: default warehouse location is /user/hive/warehouse
  16/01/26 16:35:23 INFO HiveContext: Initializing HiveMetastoreConnection version 0.14.0 using maven.
  Ivy Default Cache set to: /home/hive/.ivy2/cache
  The jars for the packages stored in: /home/hive/.ivy2/jars
  http://www.datanucleus.org/downloads/maven2 added as a remote repository with the name: repo-1
  :: loading settings :: url = jar:file:/usr/lib/spark/lib/spark-assembly-1.6.0-hadoop2.6.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
  org.apache.calcite#calcite-core added as a dependency
  org.apache.calcite#calcite-avatica added as a dependency
  org.apache.hive#hive-metastore added as a dependency
  org.apache.hive#hive-exec added as a dependency
  org.apache.hive#hive-common added as a dependency
  org.apache.hive#hive-serde added as a dependency
  com.google.guava#guava added as a dependency
  org.apache.hadoop#hadoop-client added as a dependency
  :: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
  confs: [default]

No answers yet.