Finding the number of weekend days between two dates in PySpark

1u4esq0p  asked on 2021-07-13  in  Spark

Here is a sample of my Spark DataFrame:

+------------+-------------------+-------------------+
|          OT|          Fecha_Fst|          Fecha_Lst|
+------------+-------------------+-------------------+
|712268242652|2021-01-30 14:43:00|2021-02-03 13:03:00|
|712268243525|2021-01-30 14:27:00|2021-02-03 14:50:00|
|712268243831|2021-02-02 21:23:00|2021-02-08 17:39:00|
|712268244225|2021-02-01 07:26:00|2021-02-09 11:22:00|
|712268244951|2021-02-01 07:25:00|2021-02-05 16:07:00|
|712268245076|2021-02-01 07:26:00|2021-02-06 13:22:00|
|712268245651|2021-01-28 16:49:00|2021-02-04 13:31:00|
|712268246782|2021-02-01 07:26:00|2021-02-05 12:24:00|
|712268247644|2021-02-02 18:20:00|2021-02-05 16:12:00|
|712268247681|2021-02-09 05:03:00|2021-02-15 14:16:00|
|712268247751|2021-02-02 15:42:00|2021-02-05 13:27:00|
|712268247854|2021-01-30 14:34:00|2021-01-30 14:34:00|
|712268248775|2021-02-02 15:42:00|2021-02-05 12:42:00|
|712268249173|2021-02-02 15:42:00|2021-02-05 15:51:00|
|712268249873|2021-02-02 09:05:00|2021-02-05 19:36:00|
|712268249884|2021-02-02 08:53:00|2021-02-05 19:36:00|
|712268249895|2021-02-02 08:14:00|2021-02-05 19:36:00|
|712268249906|2021-02-02 09:06:00|2021-02-05 19:36:00|
|712268249910|2021-02-02 08:53:00|2021-02-05 19:36:00|
|712268250186|2021-02-02 15:42:00|2021-02-05 18:59:00|
+------------+-------------------+-------------------+

I found this code online:

a = "2021-02-10T23:59:00.000+0000"
b = "2021-03-20T23:59:00.000+0000"
week = {}

def weekday_count(start, end):
    start_date = datetime.datetime.strptime(start, "%Y-%m-%dT%H:%M:%S.%f%z")
    end_date = datetime.datetime.strptime(end, "%Y-%m-%dT%H:%M:%S.%f%z")

    for i in range((end_date - start_date).days):
        day = calendar.day_name[(start_date + datetime.timedelta(days=i + 1)).weekday()]
        week[day] = week[day] + 1 if day in week else 1
    return week["Sunday"] + week["Saturday"]

print(weekday_count(a, b))

11

It works fine and gives me what I want, but I can't use it on my Spark DataFrame. I have tried many approaches, but I always end up with errors such as:

df = df.withColumn("Number", weekday_count(f.col("Fecha_Fst"),f.col("Fecha_Lst")))

TypeError: strptime() argument 1 must be str, not Column
And if I use a lambda instead:

def weekday_count(start, end):
    start_date = lambda start :datetime.datetime.strptime(start, "%Y-%m-%dT%H:%M:%S.%f%z")
    end_date = lambda end :datetime.datetime.strptime(end, "%Y-%m-%dT%H:%M:%S.%f%z")

    for i in range((end_date - start_date).days):
        day = calendar.day_name[(start_date + datetime.timedelta(days=i + 1)).weekday()]
        week[day] = week[day] + 1 if day in week else 1
    return week["Sunday"] + week["Saturday"]

df = df.withColumn("Number", weekday_count(f.col("Fecha_Fst"),f.col("Fecha_Lst")))

TypeError: unsupported operand type(s) for -: 'function' and 'function'
And so on... I have tried many variations today, but none of them produced the result I want:

+------------+-------------------+-------------------+-----------+
|          OT|          Fecha_Fst|          Fecha_Lst|       Days|
+------------+-------------------+-------------------+-----------+
|712268242652|2021-01-30 14:43:00|2021-02-03 13:03:00|          2|
|712268243831|2021-02-02 21:23:00|2021-02-08 17:39:00|          2|
|712268244225|2021-02-01 07:26:00|2021-02-09 11:22:00|          2| 
|712268244951|2021-02-01 07:25:00|2021-02-05 16:07:00|          0|
|712268247681|2021-02-09 05:03:00|2021-02-15 14:16:00|          2|
|712268247854|2021-01-30 14:34:00|2021-01-30 14:34:00|          1|
|712268248775|2021-02-02 15:42:00|2021-02-05 12:42:00|          0|
|712268249173|2021-02-02 15:42:00|2021-02-05 15:51:00|          0|
|712268249873|2021-02-02 09:05:00|2021-02-05 19:36:00|          0|
+------------+-------------------+-------------------+-----------+

I'm not sure how to build this new column with the PySpark API. I used to do this with pandas, but I'm currently working in an Azure Databricks environment and pandas is very slow.

c90pui9n 1#

You can't use a plain Python function directly on DataFrame columns; you would need to wrap it in a PySpark UDF.
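Just for reference, a rough sketch of what that UDF route could look like. It assumes Fecha_Fst / Fecha_Lst are timestamp columns (so the UDF receives datetime.datetime values) and counts both endpoints inclusively, matching the built-in solution below; weekend_days is a made-up name:

import datetime

from pyspark.sql import functions as F
from pyspark.sql.types import IntegerType

@F.udf(returnType=IntegerType())
def weekend_days(start, end):
    # start/end arrive as datetime.datetime when the columns are timestamps
    if start is None or end is None:
        return None
    count = 0
    for i in range((end - start).days + 1):  # walk day by day, endpoints included
        if (start + datetime.timedelta(days=i)).weekday() >= 5:  # 5 = Sat, 6 = Sun
            count += 1
    return count

df_udf = df.withColumn("Days", weekend_days(F.col("Fecha_Fst"), F.col("Fecha_Lst")))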
However, you can actually get the result you want with Spark built-in functions alone: generate a sequence of timestamps from Fecha_Fst to Fecha_Lst, filter it down to the entries that fall on a Sat or Sun, and the size of the resulting array is the number of weekend days.

from pyspark.sql import functions as F

df1 = df.withColumn(
    "Days",
    F.expr(
        """
        size(
            filter(
                sequence(cast(Fecha_Fst as timestamp), cast(Fecha_Lst as timestamp), interval 1 day),
                x -> date_format(x, 'E') in ('Sat', 'Sun')
            )
        )
        """
    )
)

df1.show(truncate=False)

# +------------+-------------------+-------------------+----+
# |OT          |Fecha_Fst          |Fecha_Lst          |Days|
# +------------+-------------------+-------------------+----+
# |712268242652|2021-01-30 14:43:00|2021-02-03 13:03:00|2   |
# |712268243831|2021-02-02 21:23:00|2021-02-08 17:39:00|2   |
# |712268244225|2021-02-01 07:26:00|2021-02-09 11:22:00|2   |
# |712268244951|2021-02-01 07:25:00|2021-02-05 16:07:00|0   |
# |712268247681|2021-02-09 05:03:00|2021-02-15 14:16:00|2   |
# |712268247854|2021-01-30 14:34:00|2021-01-30 14:34:00|1   |
# |712268248775|2021-02-02 15:42:00|2021-02-05 12:42:00|0   |
# |712268249173|2021-02-02 15:42:00|2021-02-05 15:51:00|0   |
# |712268249873|2021-02-02 09:05:00|2021-02-05 19:36:00|0   |
# +------------+-------------------+-------------------+----+
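If you are on Spark 3.1 or newer, roughly the same logic can also be written with the DataFrame API instead of an SQL expression string. A sketch under the same assumptions about the columns:

from pyspark.sql import functions as F

# Build the sequence of daily timestamps, keep only Saturdays/Sundays,
# and count them (F.filter requires Spark 3.1+)
weekend = F.filter(
    F.sequence(
        F.col("Fecha_Fst").cast("timestamp"),
        F.col("Fecha_Lst").cast("timestamp"),
        F.expr("interval 1 day"),
    ),
    lambda x: F.date_format(x, "E").isin("Sat", "Sun"),
)

df1 = df.withColumn("Days", F.size(weekend))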
