Spark SQL with CASE and SUM

nkkqxpd9  posted on 2021-06-25 in Hive

I have a Hive table with columns (id, dept, salary), and I am applying some transformations to it using Spark SQL.
i.e. if dept is HR then salary = salary + 100, if dept is IT then salary = salary + 0, and then take the sum of salary based on dept. Please find the data below.
Input data

+----+-----------+--------+
| id |   dept    | salary |
+----+-----------+--------+
|  1 | HR        |    100 |
|  2 | operation |    200 |
|  3 | tech      |    300 |
|  4 | IT        |    400 |
|  1 | HR        |    500 |
+----+-----------+--------+

Expected output

+----+-----------+--------+
| id |   dept    | salary |
+----+-----------+--------+
|  1 | HR        |    800 |
|  2 | operation |    300 |
|  3 | tech      |    300 |
|  4 | IT        |    400 |
+----+-----------+--------+

I have written the code below, but it is not working in Spark SQL.

spark.sql("select CASE WHEN dept = 'HR' THEN 'sum(salary+100)',when  dept = 'IT' THEN 'sum(salary+0)' ELSE 'salary' END AS salary from emp group by dept").show

But I am getting a "mismatched input" error. Please help me with how to achieve this.

vaqhlq81 1#

Select Dept, sum(salary + case Dept when 'HR' then 100 when 'IT' then 0 else 0 end)
From emp
Group by Dept

Select Dept, sum(salary) + sum(case Dept when 'HR' then 100 when 'IT' then 0 else 0 end)
From emp
Group by Dept
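For reference, here is a minimal runnable sketch of the first variant (a sketch only: the local SparkSession setup, the sample rows, and the view name emp are assumptions that simply mirror the question, not part of the original answer):

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("case-sum").master("local[*]").getOrCreate()
import spark.implicits._

// Sample rows from the question, registered as a temp view named "emp" (assumed name)
Seq((1, "HR", 100), (2, "operation", 200), (3, "tech", 300), (4, "IT", 400), (1, "HR", 500))
  .toDF("id", "dept", "salary")
  .createOrReplaceTempView("emp")

// The CASE expression sits inside SUM, so the bonus is applied to each row before aggregation
spark.sql("""
  SELECT dept,
         SUM(salary + CASE dept WHEN 'HR' THEN 100 WHEN 'IT' THEN 0 ELSE 0 END) AS salary
  FROM emp
  GROUP BY dept
""").show()

For HR this gives (100 + 100) + (500 + 100) = 800. The second variant keeps SUM(salary) and the summed CASE bonus as separate aggregates, which produces the same totals because the two sums can be split.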

gwo2fgha 2#

The Spark SQL query for the expected output is:

spark.sql("Select id,dept, sum(salary) + sum(case dept when 'HR' then 100 when 'IT' then 0 when 'operation' then 100 else 0 end) as salary  From employee1 group by id,dept order by id asc").show()


I hope this helps.
