How to convert a column of string dates to a column of unix epochs in Spark

zpqajqem posted on 2021-05-27 in Spark

I'm new to Spark and Scala, and I want to convert a column of string dates into unix epochs. My DataFrame looks like this:

+----------+-------+
|   Dates  |Reports|
+----------+-------+
|2020-07-20|     34|
|2020-07-21|     86|
|2020-07-22|    129|
|2020-07-23|     98|
+----------+-------+
The output should be 
+----------+-------+
|   Dates  |Reports|
+----------+-------+
|1595203200|     34|
|1595289600|     86|
|1595376000|    129|
|1595462400|     98|
+----------+-------+

4si2a6ki1#

Use unix_timestamp.

import org.apache.spark.sql.functions.unix_timestamp
import spark.implicits._  // provides toDF and the 'date column syntax

val df = Seq(("2020-07-20")).toDF("date")
df.show
df.withColumn("unix_time", unix_timestamp('date, "yyyy-MM-dd")).show

+----------+
|      date|
+----------+
|2020-07-20|
+----------+

+----------+----------+
|      date| unix_time|
+----------+----------+
|2020-07-20|1595203200|
+----------+----------+
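
Applied to the DataFrame from the question, the same call can overwrite the Dates column directly. Below is a minimal sketch assuming an active SparkSession is in scope as spark; the Dates/Reports column names come from the question, and the variable name reports is just illustrative.

import org.apache.spark.sql.functions.unix_timestamp
import spark.implicits._

// Rebuild the sample data from the question
val reports = Seq(
  ("2020-07-20", 34), ("2020-07-21", 86),
  ("2020-07-22", 129), ("2020-07-23", 98)
).toDF("Dates", "Reports")

// Replace the string dates with unix epochs (seconds since 1970-01-01)
reports
  .withColumn("Dates", unix_timestamp($"Dates", "yyyy-MM-dd"))
  .show()

Note that unix_timestamp parses the string using the session time zone, so the epoch values will only match the expected output above exactly when spark.sql.session.timeZone is set to UTC.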
