Scala - How to change a column's value based on the value of another column in a Spark DataFrame

qybjjes1  asked on 2021-05-27  in Spark

I'm starting with this DataFrame:

DF1
+-----+-----+------+------+
|name |type |item1 |item2 |
+-----+-----+------+------+
|apple|fruit|apple1|apple2|
|beans|vege |beans1|beans2|
|beef |meat |beef1 |beef2 |
|kiwi |fruit|kiwi1 |kiwi2 |
|pork |meat |pork1 |pork2 |
+-----+-----+------+------+

Now I want to populate a new column called "prop" based on the value of the "type" column, producing DF2 below. For example:

If "type"== "fruit" then "prop"="item1"
If "type"== "vege" then "prop"="item1"
If "type"== "meat" then "prop"="item2"

What is the best way to do this? I was thinking of filtering on each "type", filling in the "prop" column, and then unioning the resulting DataFrames (see the sketch after the expected output below), but that does not seem efficient.

DF2
+-----+-----+------+------+------+
|name |type |item1 |item2 |prop  |
+-----+-----+------+------+------+
|apple|fruit|apple1|apple2|apple1|
|beans|vege |beans1|beans2|beans1|
|beef |meat |beef1 |beef2 |beef2 |
|kiwi |fruit|kiwi1 |kiwi2 |kiwi1 |
|pork |meat |pork1 |pork2 |pork2 |
+-----+-----+------+------+------+
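
For reference, the filter-and-union approach mentioned above would look roughly like this (a sketch only, assuming the first DataFrame is available as df); it produces the expected output but scans the data once per type:

import org.apache.spark.sql.functions.col

// one filtered slice per type, each given its own "prop" column,
// then stitched back together with union
val byFilterAndUnion =
  df.filter(col("type") === "fruit").withColumn("prop", col("item1"))
    .union(df.filter(col("type") === "vege").withColumn("prop", col("item1")))
    .union(df.filter(col("type") === "meat").withColumn("prop", col("item2")))

byFilterAndUnion.show()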

nmpmafwu1#

Using when + otherwise expressions for this case is very efficient in Spark.

//sample data
df.show()
//+-----+-----+------+------+
//| name| type| item1| item2|
//+-----+-----+------+------+
//|apple|fruit|apple1|apple2|
//|beans| vege|beans1|beans2|
//| beef| meat| beef1| beef2|
//| kiwi|fruit| kiwi1| kiwi2|
//| pork| meat| pork1| pork2|
//+-----+-----+------+------+
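
For completeness, the sample DataFrame above can be created like this (a sketch assuming an existing SparkSession named spark):

import org.apache.spark.sql.functions.{col, when}
import spark.implicits._

// recreate the sample data shown above
val df = Seq(
  ("apple", "fruit", "apple1", "apple2"),
  ("beans", "vege",  "beans1", "beans2"),
  ("beef",  "meat",  "beef1",  "beef2"),
  ("kiwi",  "fruit", "kiwi1",  "kiwi2"),
  ("pork",  "meat",  "pork1",  "pork2")
).toDF("name", "type", "item1", "item2")

The snippets below use col and when from org.apache.spark.sql.functions, imported above.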

//using the isin function
df.withColumn("prop",
  when(col("type").isin(Seq("vege", "fruit"): _*), col("item1"))
    .when(col("type") === "meat", col("item2"))
    .otherwise(col("type"))
).show()

//equivalent form with explicit equality checks
df.withColumn("prop",
  when(col("type") === "fruit" || col("type") === "vege", col("item1"))
    .when(col("type") === "meat", col("item2"))
    .otherwise(col("type"))
).show()
//+-----+-----+------+------+------+
//| name| type| item1| item2|  prop|
//+-----+-----+------+------+------+
//|apple|fruit|apple1|apple2|apple1|
//|beans| vege|beans1|beans2|beans1|
//| beef| meat| beef1| beef2| beef2|
//| kiwi|fruit| kiwi1| kiwi2| kiwi1|
//| pork| meat| pork1| pork2| pork2|
//+-----+-----+------+------+------+

qyyhg6bp2#

It can be done by chaining when and otherwise as shown below:

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object WhenThen {

  def main(args: Array[String]): Unit = {
    // build a local SparkSession (the original used a project helper, Constant.getSparkSess)
    val spark = SparkSession.builder().master("local[*]").appName("WhenThen").getOrCreate()

    import spark.implicits._
    // sample data matching the question's DF1
    val df = List(
      ("apple", "fruit", "apple1", "apple2"),
      ("beans", "vege",  "beans1", "beans2"),
      ("beef",  "meat",  "beef1",  "beef2"),
      ("kiwi",  "fruit", "kiwi1",  "kiwi2"),
      ("pork",  "meat",  "pork1",  "pork2")
    ).toDF("name", "type", "item1", "item2")

    // nested when/otherwise: each otherwise falls through to the next condition
    df.withColumn("prop",
      when($"type" === "fruit", $"item1").otherwise(
        when($"type" === "vege", $"item1").otherwise(
          when($"type" === "meat", $"item2").otherwise("")
        )
      )).show()
  }

}
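
For comparison, the same logic can be written without nesting by chaining .when calls directly, which reads a bit flatter (a sketch reusing the df built in main above):

// equivalent chained form: conditions are checked in order,
// and otherwise("") supplies the default for unmatched types
df.withColumn("prop",
  when($"type" === "fruit", $"item1")
    .when($"type" === "vege", $"item1")
    .when($"type" === "meat", $"item2")
    .otherwise("")
).show()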
