I am trying to merge the values of several columns into a single column based on a group-by key. Basically, I am using the Spark 1.6 DataFrame API to create nested JSON.
Sample input table:
a b c d e f g
---------------------------------------------
aa bb cc dd ee ff gg
aa bb cc1 dd1 ee1 ff1 gg1
aa bb cc2 dd2 ee2 ff2 gg2
aa1 bb1 cc3 dd3 ee3 ff3 gg3
aa1 bb1 cc4 dd4 ee4 ff4 gg4
Final output, grouped by a, b:
aa bb {{cc,dd,ee,ff,gg},{cc1,dd1,ee1,ff1,gg1},{cc2,dd2,ee2,ff2,gg2}}
aa1 bb1 {{cc3,dd3,ee3,ff3,gg3},{cc4,dd4,ee4,ff4,gg4}}
I tried using collect_list, but it only works on a single column and I don't know how to combine several columns into one group. I also tried concatenating the columns into a string and collecting that, but then I lose the schema mapping, which I need because I eventually have to dump the result as JSON. Clubbing the columns into a Map or Struct would also be fine. Please suggest an elegant approach/solution for this problem. Thanks. Note: using Spark 1.6.
1 Answer
Both queries are run with
sqlContext.sql("select ...");
```
select a,b
      ,collect_list(array(c,d,e,f,g))
from abc
group by a,b;

+-----+-----+----------------------------------------------------------------------------------------------+
| aa  | bb  | [["cc","dd","ee","ff","gg"],["cc1","dd1","ee1","ff1","gg1"],["cc2","dd2","ee2","ff2","gg2"]] |
+-----+-----+----------------------------------------------------------------------------------------------+
| aa1 | bb1 | [["cc3","dd3","ee3","ff3","gg3"],["cc4","dd4","ee4","ff4","gg4"]]                            |
+-----+-----+----------------------------------------------------------------------------------------------+
```
```
select a,b
      ,collect_list(struct(c,d,e,f,g))
from abc
group by a,b;

+-----+-----+-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| aa  | bb  | [{"col1":"cc","col2":"dd","col3":"ee","col4":"ff","col5":"gg"},{"col1":"cc1","col2":"dd1","col3":"ee1","col4":"ff1","col5":"gg1"},{"col1":"cc2","col2":"dd2","col3":"ee2","col4":"ff2","col5":"gg2"}] |
+-----+-----+-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| aa1 | bb1 | [{"col1":"cc3","col2":"dd3","col3":"ee3","col4":"ff3","col5":"gg3"},{"col1":"cc4","col2":"dd4","col3":"ee4","col4":"ff4","col5":"gg4"}]                                                               |
+-----+-----+-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
```
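Since the question mentions the Spark 1.6 DataFrame API, here is a minimal Scala sketch of the same aggregation done with groupBy/agg instead of SQL. It assumes the table abc is already registered and that sqlContext is a HiveContext (as in the spark-shell session below), since collect_list is Hive-backed in Spark 1.6; the alias vals is just illustrative.

```scala
import org.apache.spark.sql.functions.{collect_list, struct}

// Assumes `sqlContext` is a HiveContext and the table `abc` exists,
// as in the spark-shell session shown below.
val df = sqlContext.table("abc")

// struct("c", "d", ...) keeps the original column names as struct field
// names, which later become the keys in the JSON output.
val grouped = df
  .groupBy("a", "b")
  .agg(collect_list(struct("c", "d", "e", "f", "g")).as("vals"))

grouped.show()
```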
```
[cloudera@quickstart ~]$ spark-shell --version
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.6.0
      /_/

Type --help for more information.
[cloudera@quickstart ~]$
```
```
[cloudera@quickstart ~]$ spark-shell

scala> sqlContext.sql("select * from abc").show;
+---+---+---+---+---+---+---+
|  a|  b|  c|  d|  e|  f|  g|
+---+---+---+---+---+---+---+
| aa| bb| cc| dd| ee| ff| gg|
| aa| bb|cc1|dd1|ee1|ff1|gg1|
| aa| bb|cc2|dd2|ee2|ff2|gg2|
|aa1|bb1|cc3|dd3|ee3|ff3|gg3|
|aa1|bb1|cc4|dd4|ee4|ff4|gg4|
+---+---+---+---+---+---+---+

scala> sqlContext.sql("select a,b,collect_list(array(c,d,e,f,g)) from abc group by a,b").show;
+---+---+--------------------+
|  a|  b|                 _c2|
+---+---+--------------------+
|aa1|bb1|[[cc3, dd3, ee3, ...|
| aa| bb|[[cc, dd, ee, ff,...|
+---+---+--------------------+

scala> sqlContext.sql("select a,b,collect_list(struct(c,d,e,f,g)) from abc group by a,b").show;
+---+---+--------------------+
|  a|  b|                 _c2|
+---+---+--------------------+
|aa1|bb1|[[cc3,dd3,ee3,ff3...|
| aa| bb|[[cc,dd,ee,ff,gg]...|
+---+---+--------------------+
```
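To get the nested JSON the question ultimately needs, the grouped result can be serialized row by row. Below is a minimal sketch, assuming the grouped DataFrame from the earlier sketch (the output path is only an example). Note that the SQL struct(c,d,e,f,g) names its fields col1..col5 as shown in the results above, while the DataFrame API struct("c", ...) keeps the original column names as the JSON keys.

```scala
// toJSON returns an RDD[String] in Spark 1.6, one JSON document per row, e.g.
//   {"a":"aa","b":"bb","vals":[{"c":"cc","d":"dd","e":"ee","f":"ff","g":"gg"}, ...]}
grouped.toJSON.take(2).foreach(println)

// Or write the whole result out as JSON files (path is only an example).
grouped.write.json("/tmp/abc_nested_json")
```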