How do I load a CSV (comma-separated) file into an HBase table using Flume?

qlfbtfca · asked 2021-06-02 · in Hadoop

I want to load a CSV (comma-separated) file into my HBase table. Following some articles I found via Google, I can now load each entire line into HBase as a single value, i.e. all the values in a row end up stored in one column. What I want instead is to split each line on the comma delimiter and store the resulting values in separate columns of a column family in the HBase table.
Please help me solve this problem. Any suggestions would be appreciated.
Below are the input file, agent configuration file, and HBase output I am currently working with.

1) Input file

8600000US00601,00601,006015-DigitZCTA,0063-DigitZCTA,11102
8600000US00602,00602,006025-DigitZCTA,0063-DigitZCTA,12869
8600000US00603,00603,006035-DigitZCTA,0063-DigitZCTA,12423
8600000US00604,00604,006045-DigitZCTA,0063-DigitZCTA,33548
8600000US00606,00606,006065-DigitZCTA,0063-DigitZCTA,10603
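To make the goal concrete: each input line should break into five values, one per target column. A minimal Python sketch of the intended split (the sample data contains no quoted fields, so a plain comma split suffices):

```python
# One line from the input file above
line = "8600000US00601,00601,006015-DigitZCTA,0063-DigitZCTA,11102"

# Splitting on the comma delimiter yields the five values that should
# land in five separate columns of the HBase column family
fields = line.split(",")
print(fields)
# → ['8600000US00601', '00601', '006015-DigitZCTA', '0063-DigitZCTA', '11102']
```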

2) Agent configuration file

agent.sources  = spool
agent.channels = fileChannel2
agent.sinks    = sink2

agent.sources.spool.type = spooldir
agent.sources.spool.spoolDir = /home/cloudera/Desktop/flume
agent.sources.spool.fileSuffix = .completed
agent.sources.spool.channels = fileChannel2

# agent.sources.spool.deletePolicy = immediate

agent.sinks.sink2.type = org.apache.flume.sink.hbase.HBaseSink
agent.sinks.sink2.channel = fileChannel2
agent.sinks.sink2.table = sample
agent.sinks.sink2.columnFamily = s1
agent.sinks.sink2.serializer = org.apache.flume.sink.hbase.RegexHbaseEventSerializer
agent.sinks.sink1.serializer.regex = "\"([^\"]+)\""
agent.sinks.sink2.serializer.regexIgnoreCase = true
agent.sinks.sink1.serializer.colNames =col1,col2,col3,col4,col5
agent.sinks.sink2.batchSize = 100
agent.channels.fileChannel2.type=memory
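Two details stand out in the configuration above: the `serializer.regex` and `serializer.colNames` keys are set on `sink1` while the sink is named `sink2` (so the `RegexHbaseEventSerializer` falls back to its defaults, which write the whole line to a single `payload` column, matching the output below), and the regex matches double-quoted strings rather than comma-separated fields. A sketch of a corrected sink section, assuming exactly five comma-separated columns per line:

```properties
agent.sinks.sink2.type = org.apache.flume.sink.hbase.HBaseSink
agent.sinks.sink2.channel = fileChannel2
agent.sinks.sink2.table = sample
agent.sinks.sink2.columnFamily = s1
agent.sinks.sink2.serializer = org.apache.flume.sink.hbase.RegexHbaseEventSerializer
# One capture group per CSV field; [^,]* also accepts empty fields
agent.sinks.sink2.serializer.regex = ([^,]*),([^,]*),([^,]*),([^,]*),([^,]*)
agent.sinks.sink2.serializer.colNames = col1,col2,col3,col4,col5
agent.sinks.sink2.batchSize = 100
```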

3) HBase output

hbase(main):009:0> scan 'sample'
ROW                                         COLUMN+CELL                                                                                                                 
 1431064328720-0LalKGmSf3-1                 column=s1:payload, timestamp=1431064335428, value=8600000US00602,00602,006025-DigitZCTA,0063-DigitZCTA,12869                
 1431064328720-0LalKGmSf3-2                 column=s1:payload, timestamp=1431064335428, value=8600000US00603,00603,006035-DigitZCTA,0063-DigitZCTA,12423                
 1431064328720-0LalKGmSf3-3                 column=s1:payload, timestamp=1431064335428, value=8600000US00604,00604,006045-DigitZCTA,0063-DigitZCTA,33548                
 1431064328721-0LalKGmSf3-4                 column=s1:payload, timestamp=1431064335428, value=8600000US00606,00606,006065-DigitZCTA,0063-DigitZCTA,10603                
4 row(s) in 0.0570 seconds

hbase(main):010:0>
