Hive table data and MapReduce

zlwx9yxi posted on 2021-05-30 in Hadoop

I'm getting the following error when executing a MapReduce job on Linux (CentOS). I have added all the required jars to the classpath. The database and table exist in the Hive database, and the table contains data, yet I still cannot access data from the Hive table.
I'm working with a vanilla Hadoop installation. Do I need to edit the hive-site.xml file with the MySQL driver path, username, and password? If so, please tell me the procedure for adding the username and password for Hive. Thanks in advance.
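For context, pointing the Hive metastore at a MySQL backend is normally done in hive-site.xml with the `javax.jdo.option.*` properties shown below. This is a minimal sketch; the host, database name, user, and password here are placeholders, not values from this question:

```xml
<configuration>
  <!-- JDBC URL of the MySQL database that backs the metastore (placeholder host/db) -->
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://localhost:3306/metastore?createDatabaseIfNotExist=true</value>
  </property>
  <!-- MySQL JDBC driver class; the connector jar must be on Hive's classpath -->
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
  <!-- Credentials for the metastore database (placeholders) -->
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hiveuser</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hivepassword</value>
  </property>
</configuration>
```

Note that the "table not found" error below can also occur when this file is simply not on the job's classpath, in which case Hive falls back to a fresh local Derby metastore that contains no tables.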

murali]# hadoop jar /home/murali/workspace/hadoop/HiveInputForMapper/target/HiveInputForMapper-0.0.1-SNAPSHOT.jar com.cosmonet.HiveInputDriver -libjars $LIBJARS
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/hadoop/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/murali/.m2/repository/org/slf4j/slf4j-log4j12/1.7.5/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Java HotSpot(TM) Server VM warning: You have loaded library /hadoop/hadoop/lib/native/libhadoop.so which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.
14/11/20 18:21:04 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
14/11/20 18:21:05 INFO metastore.HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
14/11/20 18:21:05 INFO metastore.ObjectStore: ObjectStore, initialize called
14/11/20 18:21:05 INFO DataNucleus.Persistence: Property datanucleus.cache.level2 unknown - will be ignored
14/11/20 18:21:05 INFO DataNucleus.Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
14/11/20 18:21:07 INFO metastore.ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
14/11/20 18:21:07 INFO metastore.MetaStoreDirectSql: MySQL check failed, assuming we are not on mysql: Lexical error at line 1, column 5.  Encountered: "@" (64), after : "".
14/11/20 18:21:08 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
14/11/20 18:21:08 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
14/11/20 18:21:09 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
14/11/20 18:21:09 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
14/11/20 18:21:09 INFO DataNucleus.Query: Reading in results for query "org.datanucleus.store.rdbms.query.SQLQuery@0" since the connection used is closing
14/11/20 18:21:09 INFO metastore.ObjectStore: Initialized ObjectStore
14/11/20 18:21:09 INFO metastore.HiveMetaStore: Added admin role in metastore
14/11/20 18:21:09 INFO metastore.HiveMetaStore: Added public role in metastore
14/11/20 18:21:09 INFO metastore.HiveMetaStore: No user is added in admin role, since config is empty
14/11/20 18:21:09 INFO metastore.HiveMetaStore: 0: get_databases: NonExistentDatabaseUsedForHealthCheck
14/11/20 18:21:09 INFO HiveMetaStore.audit: ugi=root	ip=unknown-ip-addr	cmd=get_databases: NonExistentDatabaseUsedForHealthCheck	
14/11/20 18:21:09 INFO metastore.HiveMetaStore: 0: get_table : db=bigdata tbl=categories
14/11/20 18:21:09 INFO HiveMetaStore.audit: ugi=root	ip=unknown-ip-addr	cmd=get_table : db=bigdata tbl=categories	
Exception in thread "main" java.io.IOException: NoSuchObjectException(message:bigdata.categories table not found)
	at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:97)
	at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:71)
	at com.cosmonet.HiveInputDriver.run(HiveInputDriver.java:27)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
	at com.cosmonet.HiveInputDriver.main(HiveInputDriver.java:49)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
Caused by: NoSuchObjectException(message:bigdata.categories table not found)
	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table(HiveMetaStore.java:1560)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:105)
	at com.sun.proxy.$Proxy9.get_table(Unknown Source)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:997)
	at org.apache.hive.hcatalog.common.HCatUtil.getTable(HCatUtil.java:191)
	at org.apache.hive.hcatalog.mapreduce.InitializeInput.getInputJobInfo(InitializeInput.java:105)
	at org.apache.hive.hcatalog.mapreduce.InitializeInput.setInput(InitializeInput.java:86)
	at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:95)
	... 10 more
14/11/20 18:23:10 INFO metastore.HiveMetaStore: 1: Shutting down the object store...
14/11/20 18:23:10 INFO HiveMetaStore.audit: ugi=root	ip=unknown-ip-addr	cmd=Shutting down the object store...	
14/11/20 18:23:10 INFO metastore.HiveMetaStore: 1: Metastore shutdown complete.
14/11/20 18:23:10 INFO HiveMetaStore.audit: ugi=root	ip=unknown-ip-addr	cmd=Metastore shutdown complete

62lalag41#

My guess is that the framework cannot find the correct Hive metastore. Try supplying the Hive configuration directory like this: hadoop --config $HIVE_HOME/conf


q3aa05252#

Your Hadoop installation seems to contain multiple SLF4J bindings; removing one of them may resolve the warning. Add the following exclusion to the dependency that causes the conflict.

<exclusions>
    <exclusion>
        <groupId>org.slf4j</groupId>
        <artifactId>slf4j-log4j12</artifactId>
    </exclusion>
</exclusions>
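In a Maven pom.xml, the exclusion goes inside the conflicting `<dependency>` element. A sketch, assuming the duplicate binding is pulled in transitively by the HCatalog dependency (the artifact and version here are illustrative, not taken from this question):

```xml
<dependency>
  <groupId>org.apache.hive.hcatalog</groupId>
  <artifactId>hive-hcatalog-core</artifactId>
  <version>0.13.1</version>
  <exclusions>
    <!-- Exclude the transitive SLF4J binding; Hadoop already provides one -->
    <exclusion>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-log4j12</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```

Note that the duplicate-binding warning is harmless by itself; it does not explain the NoSuchObjectException, which points at the metastore configuration instead.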

bprjcwpo3#

This solved my problem. When running the MapReduce job, supply Hive's conf directory via the --config option: hadoop --config $HIVE_HOME/conf jar jar_name MainClass args
