Updating array values in a PySpark DataFrame

whitzsjs asked on 2021-07-13 in Spark
3 answers · 246 views

I want to check whether the last two values of an array column in a PySpark DataFrame are [1, 0], and if so update them to [1, 1].

Input DataFrame:

Column1    Array_column
abc        [0,1,1,0]
def        [1,1,0,0]
adf        [0,0,1,0]

Output DataFrame:

Column1    Array_column
abc        [0,1,1,1]
def        [1,1,0,0]
adf        [0,0,1,1]
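In plain Python terms, the per-row transformation being asked for can be sketched as follows (update_tail is a hypothetical helper for illustration, not part of any answer below):

```python
def update_tail(arr):
    """Return a copy of arr with a trailing [1, 0] replaced by [1, 1]."""
    if len(arr) >= 2 and arr[-2:] == [1, 0]:
        return arr[:-2] + [1, 1]
    return arr
```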

Answer 1 (bfrts1fy)

You can slice the array, apply a case when to the last two elements, and stitch the pieces back together with concat:

import pyspark.sql.functions as F

df2 = df.withColumn(
    'Array_column',
    F.expr("""
        concat(
            slice(Array_column, 1, size(Array_column) - 2),
            case when slice(Array_column, size(Array_column) - 1, 2) = array(1,0) 
                 then array(1,1)
                 else slice(Array_column, size(Array_column) - 1, 2)
            end
         )
    """)
)

df2.show()
+-------+------------+
|Column1|Array_column|
+-------+------------+
|    abc|[0, 1, 1, 1]|
|    def|[1, 1, 0, 0]|
|    adf|[0, 0, 1, 1]|
+-------+------------+
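Note that Spark SQL's slice is 1-based, unlike Python slicing, and a negative start counts from the end of the array. A pure-Python model of that behavior (spark_slice is an illustrative name, not a real API):

```python
def spark_slice(arr, start, length):
    # Mimic Spark SQL slice(arr, start, length): 1-based start,
    # negative start counts from the end; start = 0 is invalid.
    if start == 0:
        raise ValueError("slice start must not be 0")
    i = start - 1 if start > 0 else len(arr) + start
    return arr[i:i + length]
```

This is why the answer above uses slice(Array_column, size(Array_column) - 1, 2) to grab the last two elements.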

Answer 2 (11dmarpk)

You can combine array functions with a when expression:

from pyspark.sql import functions as F

df1 = df.withColumn(
    "Array_column",
    F.when(
        F.slice("Array_column", -2, 2) == F.array(F.lit(1), F.lit(0)),
        F.flatten(F.array(F.expr("slice(Array_column, 1, size(Array_column) - 2)"), F.array(F.lit(1), F.lit(1))))
    ).otherwise(F.col("Array_column"))
)

df1.show()

# +-------+------------+
# |Column1|Array_column|
# +-------+------------+
# |    abc|[0, 1, 1, 1]|
# |    def|[1, 1, 0, 0]|
# |    adf|[0, 0, 1, 1]|
# +-------+------------+
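Here flatten(array(head, tail)) simply concatenates the two array pieces (F.concat on the two array columns should work equally well). In plain Python the step reduces to list concatenation, as this minimal sketch shows:

```python
from itertools import chain

def flatten_pair(head, tail):
    # flatten(array(head, tail)) over two lists is just head + tail
    return list(chain.from_iterable([head, tail]))
```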

Answer 3 (aemubtdh)

from pyspark.sql.functions import udf, col
from pyspark.sql.types import ArrayType, IntegerType

def udf1(i):
    # If the last two elements are [1, 0], set the final element to 1
    if i[-2] == 1 and i[-1] == 0:
        i = i[:-1] + [1]
    return i

udf2 = udf(udf1, ArrayType(IntegerType()))
df1.withColumn("Array_column", udf2(col("Array_column"))).show()

+-------+------------+
|Column1|Array_column|
+-------+------------+
|    abc|[0, 1, 1, 1]|
|    def|[1, 1, 0, 0]|
|    adf|[0, 0, 1, 1]|
+-------+------------+
