Spark: selecting and adding columns with an alias

c86crjj0 · posted 2021-06-01 in Hadoop

I want to select a few columns, add or divide a couple of columns, fill some columns with spaces, and store them under new names as aliases. For example, in SQL it would look like this:

select "   " as col1, b as b1, c+d as e from table

How can I do this in Spark?

flvtvl50 · Answer 1

Using Spark SQL, you can do it like this:

import org.apache.spark.sql.functions._
import spark.implicits._ // for toDF on a local Seq (already in scope in spark-shell)

val df1 = Seq(
  ("A", 1, 5, 3),
  ("B", 3, 4, 2),
  ("C", 4, 6, 3),
  ("D", 5, 9, 1)).toDF("a", "b", "c", "d")

df1.createOrReplaceTempView("table")
df1.show()

val df2 = spark.sql("select ' ' as col1, b as b1, c+d as e from table")
df2.show()

Input:

+---+---+---+---+
|  a|  b|  c|  d|
+---+---+---+---+
|  A|  1|  5|  3|
|  B|  3|  4|  2|
|  C|  4|  6|  3|
|  D|  5|  9|  1|
+---+---+---+---+

Output:

+----+---+---+
|col1| b1|  e|
+----+---+---+
|    |  1|  8|
|    |  3|  6|
|    |  4|  9|
|    |  5| 10|
+----+---+---+
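
As a side note (not part of the original answer), the same projection can be written with selectExpr, which takes SQL expression strings directly and does not require registering a temp view. A minimal sketch, reusing df1 from above:

// Same projection as the SQL above, expressed with selectExpr; no temp view needed
df1.selectExpr("' ' as col1", "b as b1", "c + d as e").show()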

iq3niunx · Answer 2

You can also use the native DataFrame functions. For example:

import org.apache.spark.sql.functions._
import spark.implicits._ // for toDF on a local Seq (already in scope in spark-shell)

val df1 = Seq(
  ("A", 1, 5, 3),
  ("B", 3, 4, 2),
  ("C", 4, 6, 3),
  ("D", 5, 9, 1)).toDF("a", "b", "c", "d")

Then select the columns:

df1.select(lit(" ").as("col1"),
           col("b").as("b1"),
           (col("c") + col("d")).as("e"))
   .show()

which gives the expected result:

+----+---+---+
|col1| b1|  e|
+----+---+---+
|    |  1|  8|
|    |  3|  6|
|    |  4|  9|
|    |  5| 10|
+----+---+---+
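
The question also asks about dividing columns, which neither answer demonstrates. A minimal sketch along the same lines, assuming the df1 defined above (the names withSum, withRatio, e, and ratio are just illustrative):

import org.apache.spark.sql.functions.col

// Add a derived sum column while keeping the originals, and project a division with an alias.
// With default (non-ANSI) settings, integer operands are promoted to double and division by zero yields null.
val withSum   = df1.withColumn("e", col("c") + col("d"))
val withRatio = df1.select(col("a"), (col("c") / col("d")).as("ratio"))

withSum.show()
withRatio.show()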
