Importing data into HDFS with Sqoop2

b1uwtaje · published 2021-06-03 · in Hadoop
Follow (0) | Answers (2) | Views (398)

Following the official guide, http://sqoop.apache.org/docs/1.99.2/sqoop5minutesdemo.html, I successfully created a job.
However, when I run the command `submission start --jid 1`, I get the following error message:

Exception has occurred during processing command 
Server has returned exception: Exception: java.lang.Throwable Message: GENERIC_JDBC_CONNECTOR_0002:Unable to execute the SQL statement

Here is my job configuration.
Database configuration

Schema name: invoice
Table name: ds_msg_log
Table SQL statement: 
Table column names: *
Partition column name: 
Boundary query:

Output configuration

Storage type: HDFS
Output format: TEXT_FILE
Output directory: /user/root/ds_msg_log

Throttling resources

Extractors: 
Loaders:

Since the official guide says nothing about how these values should be set, does anyone know what is wrong with my job configuration?
Here is the log:

Stack trace:
     at  org.apache.sqoop.connector.jdbc.GenericJdbcExecutor (GenericJdbcExecutor.java:59)  
     at  org.apache.sqoop.connector.jdbc.GenericJdbcImportInitializer (GenericJdbcImportInitializer.java:155)  
     at  org.apache.sqoop.connector.jdbc.GenericJdbcImportInitializer (GenericJdbcImportInitializer.java:48)  
     at  org.apache.sqoop.connector.jdbc.GenericJdbcImportInitializer (GenericJdbcImportInitializer.java:37)  
     at  org.apache.sqoop.framework.FrameworkManager (FrameworkManager.java:447)  
     at  org.apache.sqoop.handler.SubmissionRequestHandler (SubmissionRequestHandler.java:112)  
     at  org.apache.sqoop.handler.SubmissionRequestHandler (SubmissionRequestHandler.java:98)  
     at  org.apache.sqoop.handler.SubmissionRequestHandler (SubmissionRequestHandler.java:68)  
     at  org.apache.sqoop.server.v1.SubmissionServlet (SubmissionServlet.java:44)  
     at  org.apache.sqoop.server.SqoopProtocolServlet (SqoopProtocolServlet.java:63)  
     at  javax.servlet.http.HttpServlet (HttpServlet.java:637)  
     at  javax.servlet.http.HttpServlet (HttpServlet.java:717)  
     at  org.apache.catalina.core.ApplicationFilterChain (ApplicationFilterChain.java:290)  
     at  org.apache.catalina.core.ApplicationFilterChain (ApplicationFilterChain.java:206)  
     at  org.apache.catalina.core.StandardWrapperValve (StandardWrapperValve.java:233)  
     at  org.apache.catalina.core.StandardContextValve (StandardContextValve.java:191)  
     at  org.apache.catalina.core.StandardHostValve (StandardHostValve.java:127)  
     at  org.apache.catalina.valves.ErrorReportValve (ErrorReportValve.java:102)  
     at  org.apache.catalina.core.StandardEngineValve (StandardEngineValve.java:109)  
     at  org.apache.catalina.connector.CoyoteAdapter (CoyoteAdapter.java:293)  
     at  org.apache.coyote.http11.Http11Processor (Http11Processor.java:859)  
     at  org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler (Http11Protocol.java:602)  
     at  org.apache.tomcat.util.net.JIoEndpoint$Worker (JIoEndpoint.java:489)  
     at  java.lang.Thread (Thread.java:724)  
Caused by: Exception: java.lang.Throwable Message: ERROR: schema "invoice" does not exist
  Position: 46
Stack trace:
     at  org.postgresql.core.v3.QueryExecutorImpl (QueryExecutorImpl.java:2102)  
     at  org.postgresql.core.v3.QueryExecutorImpl (QueryExecutorImpl.java:1835)  
     at  org.postgresql.core.v3.QueryExecutorImpl (QueryExecutorImpl.java:257)  
     at  org.postgresql.jdbc2.AbstractJdbc2Statement (AbstractJdbc2Statement.java:500)  
     at  org.postgresql.jdbc2.AbstractJdbc2Statement (AbstractJdbc2Statement.java:374)  
     at  org.postgresql.jdbc2.AbstractJdbc2Statement (AbstractJdbc2Statement.java:254)  
     at  org.apache.sqoop.connector.jdbc.GenericJdbcExecutor (GenericJdbcExecutor.java:56)  
     at  org.apache.sqoop.connector.jdbc.GenericJdbcImportInitializer (GenericJdbcImportInitializer.java:155)  
     at  org.apache.sqoop.connector.jdbc.GenericJdbcImportInitializer (GenericJdbcImportInitializer.java:48)  
     at  org.apache.sqoop.connector.jdbc.GenericJdbcImportInitializer (GenericJdbcImportInitializer.java:37)  
     at  org.apache.sqoop.framework.FrameworkManager (FrameworkManager.java:447)  
     at  org.apache.sqoop.handler.SubmissionRequestHandler (SubmissionRequestHandler.java:112)  
     at  org.apache.sqoop.handler.SubmissionRequestHandler (SubmissionRequestHandler.java:98)  
     at  org.apache.sqoop.handler.SubmissionRequestHandler (SubmissionRequestHandler.java:68)  
     at  org.apache.sqoop.server.v1.SubmissionServlet (SubmissionServlet.java:44)  
     at  org.apache.sqoop.server.SqoopProtocolServlet (SqoopProtocolServlet.java:63)  
     at  javax.servlet.http.HttpServlet (HttpServlet.java:637)  
     at  javax.servlet.http.HttpServlet (HttpServlet.java:717)  
     at  org.apache.catalina.core.ApplicationFilterChain (ApplicationFilterChain.java:290)  
     at  org.apache.catalina.core.ApplicationFilterChain (ApplicationFilterChain.java:206)  
     at  org.apache.catalina.core.StandardWrapperValve (StandardWrapperValve.java:233)  
     at  org.apache.catalina.core.StandardContextValve (StandardContextValve.java:191)  
     at  org.apache.catalina.core.StandardHostValve (StandardHostValve.java:127)  
     at  org.apache.catalina.valves.ErrorReportValve (ErrorReportValve.java:102)  
     at  org.apache.catalina.core.StandardEngineValve (StandardEngineValve.java:109)  
     at  org.apache.catalina.connector.CoyoteAdapter (CoyoteAdapter.java:293)  
     at  org.apache.coyote.http11.Http11Processor (Http11Processor.java:859)  
     at  org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler (Http11Protocol.java:602)  
     at  org.apache.tomcat.util.net.JIoEndpoint$Worker (JIoEndpoint.java:489)  
     at  java.lang.Thread (Thread.java:724)
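Judging from the `ERROR: schema "invoice" does not exist` cause above, the Generic JDBC connector schema-qualifies the table when it builds its import statement, so a "Schema name" value that does not exist in the target PostgreSQL database fails before any rows are read. A rough illustration of the kind of statement involved (not the actual Sqoop source; the helper name is hypothetical):

```python
def build_import_query(schema, table, columns):
    """Sketch of a schema-qualified import query, as the error above
    suggests the Generic JDBC connector constructs one (hypothetical)."""
    column_list = ", ".join(columns) if columns else "*"
    # If `schema` does not exist in the database, PostgreSQL rejects the
    # statement with: ERROR: schema "..." does not exist
    return f"SELECT {column_list} FROM {schema}.{table}"

print(build_import_query("invoice", "ds_msg_log", ["msg_id", "msg_ts"]))
# SELECT msg_id, msg_ts FROM invoice.ds_msg_log
```

If `ds_msg_log` actually lives in PostgreSQL's default `public` schema, setting "Schema name" to `public` (or creating the `invoice` schema) should avoid this particular error.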
pexxcrt2 1#

The value "*" in "Table column names" is not needed, because the default is already all columns. It would also help if you could share the server log to see where things went wrong.
You can get additional information, such as the full stack trace of the exception, by switching the shell to verbose mode:

set option --name verbose --value true
e0bqpujr 2#

Table column names: *
You cannot use `*`; use a comma-separated list of column names instead. You should also specify one column as the partition column; any column can be used for partitioning (it is used to split the import job into multiple tasks for parallel processing). Parameters marked optional can be left blank. Enter the integers that select HDFS for storage and the file format (SEQUENCE_FILE / TEXT_FILE).
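The partition column mentioned above is what drives the parallelism: conceptually, the column's min/max boundaries are divided into one range per extractor, and each extractor imports its own slice. A simplified sketch of that idea (illustrative only, not Sqoop's implementation):

```python
def partition_ranges(lo, hi, extractors):
    """Split the [lo, hi] span of a numeric partition column into one
    half-open sub-range per extractor task (illustrative sketch)."""
    step = (hi - lo) / extractors
    return [(lo + i * step, lo + (i + 1) * step) for i in range(extractors)]

# Four extractors over partition-column values 0..100
# -> four ranges that can be imported in parallel
print(partition_ranges(0, 100, 4))
# [(0.0, 25.0), (25.0, 50.0), (50.0, 75.0), (75.0, 100.0)]
```

This is why the partition column should be reasonably evenly distributed; a badly skewed column leaves most extractors idle while one does all the work.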
Below is a sample job that was created (show job --jid yourJobId):
sqoop:000> show job --jid 146
1 job(s) to show:
Job with id 146 and name importjob (created 10/10/13 3:46 PM, updated 10/10/13 3:46 PM)
Using Connection id 149 and Connector id 1
Database configuration

Schema name:  xx

Table name:  xxx

Table SQL statement: 

Table column names: one, two, thre

Partition column name: one

Boundary query:

Output configuration

Storage type: HDFS

Output format: TEXT_FILE

Output directory: /devanms/

Throttling resources

Extractors: 

Loaders:

Here is my blog post on the Sqoop Java client:
http://devslogics.blogspot.in/2013/09/sqoop-java-client.html
