Table-not-found exception (Error 10001) when executing a Hive query from an Oozie workflow

kpbpu008  asked on 2021-06-03  in  Hadoop
Follow (0) | Answers (1) | Views (419)

Script_SusRes.q

  select * from ufo_session_details limit 5

workflow_SusRes.xml

  <?xml version="1.0" encoding="UTF-8"?>
  <workflow-app xmlns="uri:oozie:workflow:0.4" name="hive-wf">
      <start to="hive-node"/>
      <action name="hive-node">
          <hive xmlns="uri:oozie:hive-action:0.2">
              <job-tracker>${jobTracker}</job-tracker>
              <name-node>${nameNode}</name-node>
              <configuration>
                  <property>
                      <name>mapred.job.queue.name</name>
                      <value>default</value>
                  </property>
              </configuration>
              <script>Script_SusRes.q</script>
          </hive>
          <ok to="end"/>
          <error to="fail"/>
      </action>
      <kill name="fail">
          <message>Hive failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
      </kill>
      <end name="end"/>
  </workflow-app>
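Note that the action above never passes a `hive-site.xml` to the launcher, so the Hive client started by Oozie may not know where the remote metastore lives and can fall back to a fresh local (embedded Derby) metastore that contains no tables. A minimal sketch of the same action with a `<job-xml>` element added (the `/tmp/nt283s/hive-site.xml` path is an assumption for illustration, not from the post):

```xml
<hive xmlns="uri:oozie:hive-action:0.2">
    <job-tracker>${jobTracker}</job-tracker>
    <name-node>${nameNode}</name-node>
    <!-- Assumed path: ship the cluster's hive-site.xml alongside the workflow
         so the launcher's Hive client can find the remote metastore -->
    <job-xml>/tmp/nt283s/hive-site.xml</job-xml>
    <configuration>
        <property>
            <name>mapred.job.queue.name</name>
            <value>default</value>
        </property>
    </configuration>
    <script>Script_SusRes.q</script>
</hive>
```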

job.properties

  oozieClientUrl=http://zltv5636.vci.att.com:11000/oozie
  nameNode=hdfs://zltv5635.vci.att.com:8020
  jobTracker=zltv5636.vci.att.com:50300
  queueName=default
  userName=wfe
  oozie.use.system.libpath=true
  oozie.libpath=${nameNode}/tmp/nt283s
  oozie.wf.application.path=/tmp/nt283s/workflow_SusRes.xml

Error log

  Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.HiveMain], exit code [10001] Oozie Launcher failed, finishing Hadoop job gracefully

Oozie Launcher ends
stdout log

  Logging initialized using configuration in file:/opt/app/workload/hadoop/mapred/local/taskTracker/wfe/jobcache/job_201510130626_0451/attempt_201510130626_0451_m_000000_0/work/hive-log4j.properties
  FAILED: SemanticException [Error 10001]: Line 1:14 Table not found 'ufo_session_details'
  Intercepting System.exit(10001)
  Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.HiveMain], exit code [10001]
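A quick way to confirm which metastore the failing action actually talks to is to run a few catalog queries in the same Hive environment the Oozie launcher uses (a troubleshooting sketch; if the table only shows up from your interactive `hive` shell but not here, the action is hitting a different metastore):

```sql
-- List what this metastore can see
SHOW DATABASES;
USE default;
SHOW TABLES LIKE 'ufo_session_details';
-- If the table exists, this prints its location and SerDe details
DESCRIBE FORMATTED ufo_session_details;
```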

syslog

  2015-11-03 00:26:20,599 INFO org.apache.hadoop.util.NativeCodeLoader: Loaded the native-hadoop library
  2015-11-03 00:26:20,902 INFO org.apache.hadoop.mapred.TaskRunner: Creating symlink: /opt/app/workload/hadoop/mapred/local/taskTracker/wfe/distcache/8045442539840332845_326451332_1282624021/zltv5635.vci.att.com/tmp/nt283s/Script_SusRes.q <- /opt/app/workload/hadoop/mapred/local/taskTracker/wfe/jobcache/job_201510130626_0451/attempt_201510130626_0451_m_000000_0/work/Script_SusRes.q
  2015-11-03 00:26:20,911 INFO org.apache.hadoop.mapred.TaskRunner: Creating symlink: /opt/app/workload/hadoop/mapred/local/taskTracker/wfe/distcache/3435440518513182209_187825668_1219418250/zltv5635.vci.att.com/tmp/nt283s/Script_SusRes.sql <- /opt/app/workload/hadoop/mapred/local/taskTracker/wfe/jobcache/job_201510130626_0451/attempt_201510130626_0451_m_000000_0/work/Script_SusRes.sql
  2015-11-03 00:26:20,913 INFO org.apache.hadoop.mapred.TaskRunner: Creating symlink: /opt/app/workload/hadoop/mapred/local/taskTracker/wfe/distcache/-5883507949569818012_2054276612_1203833745/zltv5635.vci.att.com/tmp/nt283s/lib <- /opt/app/workload/hadoop/mapred/local/taskTracker/wfe/jobcache/job_201510130626_0451/attempt_201510130626_0451_m_000000_0/work/lib
  2015-11-03 00:26:20,916 INFO org.apache.hadoop.mapred.TaskRunner: Creating symlink: /opt/app/workload/hadoop/mapred/local/taskTracker/wfe/distcache/6682880817470643170_1186359172_1225814386/zltv5635.vci.att.com/tmp/nt283s/workflow_SusRes.xml <- /opt/app/workload/hadoop/mapred/local/taskTracker/wfe/jobcache/job_201510130626_0451/attempt_201510130626_0451_m_000000_0/work/workflow_SusRes.xml
  2015-11-03 00:26:21,441 INFO org.apache.hadoop.util.ProcessTree: setsid exited with exit code 0
  2015-11-03 00:26:21,448 INFO org.apache.hadoop.mapred.Task: Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@698cdde3
  2015-11-03 00:26:21,602 INFO org.apache.hadoop.mapred.MapTask: Processing split: hdfs://zltv5635.vci.att.com:8020/user/wfe/oozie-oozi/0000088-151013062722898-oozie-oozi-W/hive-node--hive/input/dummy.txt:0+5
  2015-11-03 00:26:21,630 INFO com.hadoop.compression.lzo.GPLNativeCodeLoader: Loaded native gpl library
  2015-11-03 00:26:21,635 INFO com.hadoop.compression.lzo.LzoCodec: Successfully loaded & initialized native-lzo library [hadoop-lzo rev cf4e7cbf8ed0f0622504d008101c2729dc0c9ff3]
  2015-11-03 00:26:21,652 WARN org.apache.hadoop.io.compress.snappy.LoadSnappy: Snappy native library is available
  2015-11-03 00:26:21,652 INFO org.apache.hadoop.io.compress.snappy.LoadSnappy: Snappy native library loaded
  2015-11-03 00:26:21,663 INFO org.apache.hadoop.mapred.MapTask: numReduceTasks: 0
  2015-11-03 00:26:22,654 INFO SessionState:
  Logging initialized using configuration in file:/opt/app/workload/hadoop/mapred/local/taskTracker/wfe/jobcache/job_201510130626_0451/attempt_201510130626_0451_m_000000_0/work/hive-log4j.properties
  2015-11-03 00:26:22,910 INFO org.apache.hadoop.hive.ql.Driver: <PERFLOG method=Driver.run>
  2015-11-03 00:26:22,911 INFO org.apache.hadoop.hive.ql.Driver: <PERFLOG method=TimeToSubmit>
  2015-11-03 00:26:22,912 INFO org.apache.hadoop.hive.ql.Driver: <PERFLOG method=compile>
  2015-11-03 00:26:22,998 INFO hive.ql.parse.ParseDriver: Parsing command: select * from ufo_session_details limit 5
  2015-11-03 00:26:23,618 INFO hive.ql.parse.ParseDriver: Parse Completed
  2015-11-03 00:26:23,799 INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer: Starting Semantic Analysis
  2015-11-03 00:26:23,802 INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer: Completed phase 1 of Semantic Analysis
  2015-11-03 00:26:23,802 INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer: Get metadata for source tables
  2015-11-03 00:26:23,990 INFO org.apache.hadoop.hive.metastore.HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
  2015-11-03 00:26:24,031 INFO org.apache.hadoop.hive.metastore.ObjectStore: ObjectStore, initialize called
  2015-11-03 00:26:24,328 INFO DataNucleus.Persistence: Property datanucleus.cache.level2 unknown - will be ignored
  2015-11-03 00:26:28,112 INFO org.apache.hadoop.hive.metastore.ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
  2015-11-03 00:26:28,169 INFO org.apache.hadoop.hive.metastore.ObjectStore: Initialized ObjectStore
  2015-11-03 00:26:30,767 INFO org.apache.hadoop.hive.metastore.HiveMetaStore: 0: get_table : db=default tbl=ufo_session_details
  2015-11-03 00:26:30,768 INFO org.apache.hadoop.hive.metastore.HiveMetaStore.audit: ugi=wfe ip=unknown-ip-addr cmd=get_table : db=default tbl=ufo_session_details
  2015-11-03 00:26:30,781 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
  2015-11-03 00:26:30,782 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
  2015-11-03 00:26:33,319 ERROR org.apache.hadoop.hive.metastore.RetryingHMSHandler: NoSuchObjectException(message:default.ufo_session_details table not found)
  at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table(HiveMetaStore.java:1380)
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:606)
  at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:102)
  at com.sun.proxy.$Proxy11.get_table(Unknown Source)
  at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:836)
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:606)
  at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:74)
  at com.sun.proxy.$Proxy12.getTable(Unknown Source)
  at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:945)
  at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:887)
  at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1083)
  at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1059)
  at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:8680)
  at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:278)
  at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:433)
  at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:337)
  at org.apache.hadoop.hive.ql.Driver.run(Driver.java:902)
  at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
  at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
  at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:413)
  at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:348)
  at org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:446)
  at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:456)
  at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:712)
  at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:614)
  at org.apache.oozie.action.hadoop.HiveMain.runHive(HiveMain.java:261)
  at org.apache.oozie.action.hadoop.HiveMain.run(HiveMain.java:238)
  at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:37)
  at org.apache.oozie.action.hadoop.HiveMain.main(HiveMain.java:49)
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:606)
  at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:491)
  at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
  at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:429)
  at org.apache.hadoop.mapred.MapTask.run(MapTask.java:365)
  at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
  at java.security.AccessController.doPrivileged(Native Method)
  at javax.security.auth.Subject.doAs(Subject.java:415)
  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
  at org.apache.hadoop.mapred.Child.main(Child.java:249)

wz1wpwve1#

  1677 [main] INFO org.apache.hadoop.hive.ql.Driver -
  1679 [main] INFO org.apache.hadoop.hive.ql.Driver -
  1680 [main] INFO org.apache.hadoop.hive.ql.Driver -
  1771 [main] INFO hive.ql.parse.ParseDriver - Parsing command: select * from ufo_session_master limit 5
  2512 [main] INFO hive.ql.parse.ParseDriver - Parse Completed
  2683 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Starting Semantic Analysis
  2686 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Completed phase 1 of Semantic Analysis
  2686 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
  2831 [main] INFO hive.metastore - Trying to connect to metastore with URI thrift://zltv5636.vci.att.com:9083
  2952 [main] WARN hive.metastore - Failed to connect to the MetaStore Server...
  2952 [main] INFO hive.metastore - Waiting 1 second before next connection attempt.
  3952 [main] INFO hive.metastore - Trying to connect to metastore with URI thrift://zltv5636.vci.att.com:9083
  3959 [main] WARN hive.metastore - Failed to connect to the MetaStore Server...
  3960 [main] INFO hive.metastore - Waiting 1 second before next connection attempt.
  4960 [main] INFO hive.metastore - Trying to connect to metastore with URI thrift://zltv5636.vci.att.com:9083
  4967 [main] WARN hive.metastore - Failed to connect to the MetaStore Server...
  4967 [main] INFO hive.metastore - Waiting 1 second before next connection attempt.
  5978 [main] ERROR org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - org.apache.hadoop.hive.ql.metadata.HiveException: Unable to fetch table ufo_session_master
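The answerer's log makes the real failure visible: the launcher's Hive client repeatedly fails to connect to the metastore at thrift://zltv5636.vci.att.com:9083 and then fails semantic analysis with "Unable to fetch table". If the metastore service is actually up, the usual remedy is to make sure the Hive action sees a hive-site.xml carrying the metastore URI. A sketch of the relevant property (the thrift URI is taken from the log above; verify it matches your cluster):

```xml
<!-- hive-site.xml fragment (sketch): point the Hive client at the remote metastore -->
<property>
    <name>hive.metastore.uris</name>
    <value>thrift://zltv5636.vci.att.com:9083</value>
</property>
```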
