Converting specific columns into rows with Python PySpark

zmeyuzjn · published 2021-07-13 in Spark

I have a DataFrame like this:

+---+---------+----+---------+---------+-------------+-------------+
|id | company |type|rev2016  | rev2017 | main2016    |  main2017   |
+---+---------+----+---------+---------+-------------+-------------+
| 1 | google  |web | 100     |  200    |  55         |     66      |
+---+---------+----+---------+---------+-------------+-------------+

I want this output:

+---+---------+----+-------------+------+------+
|id | company |type| Metric      | 2016 | 2017 |   
+---+---------+----+-------------+------+------+
| 1 | google  |web | rev         | 100  |  200 |   
| 1 | google  |web | main        |  55  |  66  |  
+---+---------+----+-------------+------+------+

What I am trying to achieve is to unpivot the revenue and maintenance columns into rows, with a new "Metric" column identifying each one. So far I have had no luck.

v7pvogib

You can construct an array of structs from the columns, then explode the array and expand the structs to get the desired output.

import pyspark.sql.functions as F

struct_list = [
    F.struct(
        F.lit('rev').alias('Metric'),
        F.col('rev2016').alias('2016'),
        F.col('rev2017').alias('2017')
    ),
    F.struct(
        F.lit('main').alias('Metric'),
        F.col('main2016').alias('2016'),
        F.col('main2017').alias('2017')
    )
]

df2 = df.withColumn(
    'arr',
    F.explode(F.array(*struct_list))
).select('id', 'company', 'type', 'arr.*')

df2.show()
+---+-------+----+------+----+----+
| id|company|type|Metric|2016|2017|
+---+-------+----+------+----+----+
|  1| google| web|   rev| 100| 200|
|  1| google| web|  main|  55|  66|
+---+-------+----+------+----+----+
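If you have many metric/year columns, writing struct_list by hand gets tedious. As a sketch (assuming the column names follow the "<metric><year>" pattern used in the question), you can parse the DataFrame's column names first and then build one F.struct per metric from the resulting mapping:

```python
import re

# Column names from the example DataFrame; metric columns are
# assumed to follow the "<metric><year>" naming pattern.
cols = ['id', 'company', 'type', 'rev2016', 'rev2017', 'main2016', 'main2017']

metrics = {}  # metric name -> {year: column name}
for c in cols:
    m = re.fullmatch(r'([a-zA-Z]+)(\d{4})', c)
    if m:
        metric, year = m.groups()
        metrics.setdefault(metric, {})[year] = c

print(metrics)
# {'rev': {'2016': 'rev2016', '2017': 'rev2017'},
#  'main': {'2016': 'main2016', '2017': 'main2017'}}
```

Each entry of metrics then corresponds to one F.struct(F.lit(metric).alias('Metric'), ...) in struct_list, so the explode approach above scales to any number of metrics without code changes.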

Alternatively, you can use stack:

df2 = df.selectExpr(
    'id', 'company', 'type',
    "stack(2, 'rev', rev2016, rev2017, 'main', main2016, main2017) as (Metric, `2016`, `2017`)"
)

df2.show()
+---+-------+----+------+----+----+
| id|company|type|Metric|2016|2017|
+---+-------+----+------+----+----+
|  1| google| web|   rev| 100| 200|
|  1| google| web|  main|  55|  66|
+---+-------+----+------+----+----+
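The stack expression can likewise be generated instead of hand-written. A minimal sketch (pure Python string building, no Spark session needed; the metrics mapping and year list are assumptions matching the example columns):

```python
# Build the stack() SQL expression from a metric -> {year: column} mapping
# instead of writing it by hand.
metrics = {
    'rev':  {'2016': 'rev2016', '2017': 'rev2017'},
    'main': {'2016': 'main2016', '2017': 'main2017'},
}
years = ['2016', '2017']

# One quoted metric label followed by its value columns, per metric.
rows = ", ".join(
    f"'{m}', " + ", ".join(metrics[m][y] for y in years)
    for m in metrics
)
# Backtick-quote the year columns since they start with digits.
header = ", ".join(f"`{y}`" for y in years)
expr = f"stack({len(metrics)}, {rows}) as (Metric, {header})"
print(expr)
# stack(2, 'rev', rev2016, rev2017, 'main', main2016, main2017) as (Metric, `2016`, `2017`)
```

The resulting string is exactly the expression passed to selectExpr above, so the same code handles any number of metrics and years.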
