Getting null from pyspark.sql.functions unix_timestamp

Asked: 2017-08-15 16:58:02

Tags: python apache-spark pyspark unix-timestamp

I'm trying to convert a string column to a timestamp with this code:

from pyspark.sql import Row
from pyspark.sql.functions import unix_timestamp

(sc
 .parallelize([Row(dt='2017-01-23T08:12:39.929+01:00')])
 .toDF()
 .withColumn("parsed", unix_timestamp("dt", "yyyy-MM-ddThh:mm:ss")
             .cast("double")
             .cast("timestamp"))
 .show(1, False))

But I get null:

+-----------------------------+------+
|dt                           |parsed|
+-----------------------------+------+
|2017-01-23T08:12:39.929+01:00|null  |
+-----------------------------+------+

Why?

1 Answer:

Answer 0 (score: 4)

You get NULL because the format you use doesn't match the data. To get a minimal match, you have to escape the literal T with single quotes:

yyyy-MM-dd'T'kk:mm:ss

and to match the full pattern, you need S for the milliseconds and X for the timezone:

yyyy-MM-dd'T'kk:mm:ss.SSSXXX
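
As a quick check, here is a minimal sketch of the corrected call applied to the question's data (assuming a Spark 2.x-era parser, where patterns follow Java's SimpleDateFormat as in the reference below):

from pyspark.sql import Row, SparkSession
from pyspark.sql.functions import unix_timestamp

spark = SparkSession.builder.getOrCreate()
df = spark.createDataFrame([Row(dt='2017-01-23T08:12:39.929+01:00')])

# With 'T' escaped and .SSSXXX appended, the pattern matches the data,
# so unix_timestamp returns epoch seconds instead of null.
(df
 .withColumn("parsed",
             unix_timestamp("dt", "yyyy-MM-dd'T'kk:mm:ss.SSSXXX")
             .cast("timestamp"))
 .show(1, False))

Note that unix_timestamp resolves only to whole seconds, so the .929 milliseconds are dropped on the round trip.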

However, in current Spark versions, a direct cast:

from pyspark.sql.functions import col

col("dt").cast("timestamp")

should work just fine:

spark.sql(
    """SELECT CAST("2011-01-23T08:12:39.929+01:00" AS timestamp)"""
).show(1, False)
+------------------------------------------------+
|CAST(2011-01-23T08:12:39.929+01:00 AS TIMESTAMP)|
+------------------------------------------------+
|2011-01-23 08:12:39.929                         |
+------------------------------------------------+
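
The same cast also works through the DataFrame API; a short sketch, reusing the df from the example above:

from pyspark.sql.functions import col

# Spark recognizes the ISO-8601 string, including the +01:00 offset,
# and converts it according to the session time zone.
df.withColumn("parsed", col("dt").cast("timestamp")).show(1, False)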

Reference: SimpleDateFormat