Pig JOIN not working

2uluyalo · posted 2021-06-04 in Hadoop
Follow (0) | Answers (3) | Views (340)

I have a problem with a JOIN in Pig. Let me give you the context first. Here is my code:

-- START file loading
start_file = LOAD 'dir/start_file.csv' USING PigStorage(';') as (PARTRANGE:chararray, COD_IPUSER:chararray);

-- trim
A = FOREACH start_file GENERATE TRIM(PARTRANGE) AS PARTRANGE, TRIM(COD_IPUSER) AS COD_IPUSER;

dump A;

Output:

(79.92.147.88,20140310)
(79.92.147.88,20140310)
(109.31.67.3,20140310)
(109.31.67.3,20140310)
(109.7.229.143,20140310)
(109.8.114.133,20140310)
(77.198.79.99,20140310)
(77.200.174.171,20140310)
(77.200.174.171,20140310)
(109.17.117.212,20140310)

Loading the other file:

-- Load the Hadopi lookup file
file2 = LOAD 'dir/file2.csv' USING PigStorage(';') as (IP_RECHERCHEE:chararray, DATE_HADO:chararray);

dump file2;

Output:

(2014/03/10 00:00:00,79.92.147.88)
(2014/03/10 00:00:01,79.92.147.88)
(2014/03/10 00:00:00,192.168.2.67)

Now I want to do a LEFT OUTER JOIN. Here is the code:

result = JOIN file2 by IP_RECHERCHEE LEFT OUTER, A by COD_IPUSER;
dump result;

Output:

(2014/03/10 00:00:00,79.92.147.88,,)
(2014/03/10 00:00:00,192.168.2.67,,)
(2014/03/10 00:00:01,79.92.147.88,,)

All of the records from file2 are there, which is good, but none of the records from start_file appear. It's as if the join found no matches.
Do you know what the problem is?
Thanks.

bwleehnv1#

Your field names are wrong, so you are joining on the wrong fields. It looks like you want to join on the IP address.

start_file = LOAD 'dir/start_file.csv' USING PigStorage(';') as (IP:chararray, PARTRANGE:chararray);

A = FOREACH start_file GENERATE TRIM(IP) AS IP, TRIM(PARTRANGE) AS PARTRANGE;

file2 = LOAD 'dir/file2.csv' USING PigStorage(';') as (DATE_HADO:chararray, IP:chararray);
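With those corrected schemas, the join can be re-run keyed on the IP columns. A sketch (same statement as in the question, only the field names change; Pig qualifies the duplicate names as `file2::IP` and `A::IP`):

```pig
result = JOIN file2 BY IP LEFT OUTER, A BY IP;
dump result;
```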

With that, I get this:

(2014/03/10 00:00:00,192.168.2.67,,)
(2014/03/10 00:00:00,79.92.147.88,79.92.147.88,20140310)
(2014/03/10 00:00:00,79.92.147.88,79.92.147.88,20140310)
(2014/03/10 00:00:01,79.92.147.88,79.92.147.88,20140310)
(2014/03/10 00:00:01,79.92.147.88,79.92.147.88,20140310)
yjghlzjz2#

The result is as expected. You are running a LEFT OUTER JOIN, which looks for matches between the IP_RECHERCHEE field of file2 and the COD_IPUSER field of A.

Since there are no matches, it returns every row of file2 and fills the fields from A with nulls.

Obviously 2014/03/10 00:00:00 != 20140310.
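If the intent really were to join those two date columns (keeping the question's mislabeled schemas, where IP_RECHERCHEE holds the timestamp and COD_IPUSER holds the day), the formats would have to be normalized first. A sketch using Pig's built-in REPLACE and SUBSTRING; the relation name `file2_norm` and field `DAY_KEY` are illustrative, not from the answers:

```pig
-- '2014/03/10 00:00:00' -> '20140310 00:00:00' -> '20140310'
file2_norm = FOREACH file2 GENERATE
    SUBSTRING(REPLACE(IP_RECHERCHEE, '/', ''), 0, 8) AS DAY_KEY,
    DATE_HADO;
result = JOIN file2_norm BY DAY_KEY LEFT OUTER, A BY COD_IPUSER;
```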

rta7y2nd3#

You have mislabeled the fields of file2. You called the first field the IP and the second the date/time, but as the dump shows, it's the other way around. Try FOREACH file2 GENERATE IP_RECHERCHEE and you will see which field you are actually joining on.
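Spelled out, that diagnostic looks like this (the relation name `check` is just illustrative):

```pig
check = FOREACH file2 GENERATE IP_RECHERCHEE;
dump check;
-- emits the first column of file2: timestamps like 2014/03/10 00:00:00,
-- not IP addresses
```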
