Change a column type from string to date in PySpark
Problem description:
I'm trying to change my column type from string to date. I have consulted answers from:
When I tried to apply the answers from link 1, I got null results instead, so I referred to the answer from link 2, but I don't understand this part:
output_format = ... # Some SimpleDateFormat string
I would like to ask directly in a comment, but alas, my reputation is not enough.
Answer
Hope this helps!
from pyspark.sql.functions import col, unix_timestamp, to_date

# sample data (assumes an existing SparkContext named sc)
df = sc.parallelize([['12-21-2006'],
                     ['05-30-2007'],
                     ['01-01-1984'],
                     ['12-24-2017']]).toDF(["date_in_strFormat"])
df.printSchema()

# parse the MM-dd-yyyy strings into a proper date column:
# unix_timestamp parses with the given pattern, then the result
# is cast to timestamp and truncated to a date by to_date
df = df.withColumn('date_in_dateFormat',
                   to_date(unix_timestamp(col('date_in_strFormat'), 'MM-dd-yyyy')
                           .cast("timestamp")))
df.show()
df.printSchema()
The output is:
root
|-- date_in_strFormat: string (nullable = true)
+-----------------+------------------+
|date_in_strFormat|date_in_dateFormat|
+-----------------+------------------+
| 12-21-2006| 2006-12-21|
| 05-30-2007| 2007-05-30|
| 01-01-1984| 1984-01-01|
| 12-24-2017| 2017-12-24|
+-----------------+------------------+
root
|-- date_in_strFormat: string (nullable = true)
|-- date_in_dateFormat: date (nullable = true)