How do I get a comma-separated string from a list into a query in PySpark?

Problem description:

I want to generate a query by using a list in PySpark:

list = ["hi@gmail.com", "goodbye@gmail.com"]
query = "SELECT * FROM table WHERE email IN (" + list + ")"

This is the output I want:

query
SELECT * FROM table WHERE email IN ("hi@gmail.com", "goodbye@gmail.com")

Instead, I get: TypeError: cannot concatenate 'str' and 'list' objects

Can anyone help me achieve this? Thanks

If someone's having the same issue, I found that you can use the following code:

"'"+"','".join(map(str, emails))+"'"

and, once that string is substituted into the IN clause, you will get the following query:

SELECT * FROM table WHERE email IN ('hi@gmail.com','goodbye@gmail.com')
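
For completeness, here is a minimal end-to-end sketch of the same idea. The SparkSession setup, the sample DataFrame, and the view name "contacts" are placeholders added for illustration; only the IN-clause construction comes from the answer above.

from pyspark.sql import SparkSession

# Placeholder session and data so the example runs on its own.
spark = SparkSession.builder.getOrCreate()
df = spark.createDataFrame(
    [("hi@gmail.com",), ("nobody@gmail.com",)],
    ["email"],
)
df.createOrReplaceTempView("contacts")  # hypothetical view name

emails = ["hi@gmail.com", "goodbye@gmail.com"]

# Build the single-quoted, comma-separated string for the IN clause.
in_list = "'" + "','".join(map(str, emails)) + "'"

query = "SELECT * FROM contacts WHERE email IN (" + in_list + ")"
# query == "SELECT * FROM contacts WHERE email IN ('hi@gmail.com','goodbye@gmail.com')"

spark.sql(query).show()

Note that plain string concatenation does not escape the values; when the list comes from untrusted input, the same filter can also be expressed without hand-built SQL, e.g. df.filter(df.email.isin(emails)).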