Spark: write CSV with null values as empty columns

Problem description:

I'm using PySpark to write a dataframe to a CSV file like this:

df.write.csv(PATH, nullValue='')

There is a column in that dataframe of type string. Some of the values are null. These null values display like this:

...,"",...

I would like them to display like this instead:

...,,...
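The two renderings differ only in CSV quoting: an unquoted empty field versus a quoted empty string. As a quick illustration independent of Spark, Python's standard `csv` module can produce both forms:

```python
import csv
import io

# Default (minimal) quoting leaves an empty string unquoted,
# so nothing appears between the delimiters.
minimal = io.StringIO()
csv.writer(minimal).writerow(["a", "", "c"])

# QUOTE_ALL wraps every field in quotes, so the empty string
# is written as "" -- the unwanted form from the question.
quoted = io.StringIO()
csv.writer(quoted, quoting=csv.QUOTE_ALL).writerow(["a", "", "c"])

print(minimal.getvalue().strip())  # a,,c
print(quoted.getvalue().strip())   # "a","","c"
```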

Is this possible with an option in csv.write()?

Thanks!

This is easily handled by setting the emptyValue option:

emptyValue: sets the string representation of an empty value. If None is set, it uses the default value, "".

from pyspark.sql import Row, SparkSession

# A locally created session replaces the interactive `spark`
# object that `pyspark.shell` provides.
spark = SparkSession.builder.getOrCreate()

df = spark.createDataFrame([
    Row(col_1=None, col_2='20151231', col_3='Hello'),
    Row(col_1=2, col_2='20160101', col_3=None),
    Row(col_1=3, col_2=None, col_3='World')
])

df.write.csv(PATH, header=True, emptyValue='')

Output

col_1,col_2,col_3
,20151231,Hello
2,20160101,
3,,World
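As a sanity check (not part of the original answer), the output above can be parsed with Python's standard `csv` module to confirm that the missing values come back as bare empty fields and that no `""` tokens appear in the text:

```python
import csv
import io

# The CSV text shown in the Output section above.
output = """col_1,col_2,col_3
,20151231,Hello
2,20160101,
3,,World"""

rows = list(csv.reader(io.StringIO(output)))

# Each missing value parses as an empty string...
empties = [rows[1][0], rows[2][2], rows[3][1]]
print(empties)          # ['', '', '']

# ...and the raw text contains no quoted empty strings.
print('""' in output)   # False
```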