How to convert a column of string dates to a column of Unix epochs in Spark

zpqajqem · posted 2021-05-27 · Spark

I am new to Spark and Scala, and I want to convert a column of string dates to Unix epochs. My DataFrame looks like this:


    +----------+-------+
    |     Dates|Reports|
    +----------+-------+
    |2020-07-20|     34|
    |2020-07-21|     86|
    |2020-07-22|    129|
    |2020-07-23|     98|
    +----------+-------+

The output should be:

    +----------+-------+
    |     Dates|Reports|
    +----------+-------+
    |1595203200|     34|
    |1595289600|     86|
    |1595376000|    129|
    |1595462400|     98|
    +----------+-------+

4si2a6ki1#

Use unix_timestamp:

    val df = Seq(("2020-07-20")).toDF("date")
    df.show
    df.withColumn("unix_time", unix_timestamp('date, "yyyy-MM-dd")).show

    +----------+
    |      date|
    +----------+
    |2020-07-20|
    +----------+

    +----------+----------+
    |      date| unix_time|
    +----------+----------+
    |2020-07-20|1595203200|
    +----------+----------+
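Applied to the asker's DataFrame, a minimal sketch might look like the following. It assumes an active SparkSession named `spark` (as in spark-shell) and reuses the question's column names `Dates` and `Reports`. Note that `unix_timestamp` parses the string in the session time zone, so the exact epoch values shown in the question (e.g. 1595203200 for 2020-07-20) correspond to `spark.sql.session.timeZone` being UTC.

```scala
import org.apache.spark.sql.functions.unix_timestamp
import spark.implicits._  // assumes an active SparkSession `spark`

// Sample data taken from the question.
val df = Seq(
  ("2020-07-20", 34),
  ("2020-07-21", 86),
  ("2020-07-22", 129),
  ("2020-07-23", 98)
).toDF("Dates", "Reports")

// Overwrite the Dates column in place with epoch seconds.
val result = df.withColumn("Dates", unix_timestamp($"Dates", "yyyy-MM-dd"))
result.show()
```

Passing an existing column name to `withColumn` replaces that column rather than appending a new one, which matches the desired output shape of `Dates | Reports`.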
