0 votes
691 views
in Technique[技术] by (71.8m points)

Multiple conditions for filter in Spark DataFrames

I have a data frame with four fields. One of the fields is named Status, and I am trying to use an OR condition in .filter on the DataFrame. I tried the queries below, but no luck.

df2 = df1.filter(("Status=2") || ("Status =3"))

df2 = df1.filter("Status=2" || "Status =3")

Has anyone used this before? I have seen a similar question on Stack Overflow here; they used the code below for the OR condition, but that code is for PySpark.

from pyspark.sql.functions import col

numeric_filtered = df.where(
    (col('LOW')    != 'null') |
    (col('NORMAL') != 'null') |
    (col('HIGH')   != 'null'))
numeric_filtered.show()
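For reference, that column-expression pattern translates fairly directly to Scala. A rough sketch (column names taken from the snippet above; the =!= operator needs Spark 2.0+):

import org.apache.spark.sql.functions.col

// Rough Scala translation of the PySpark snippet above
// (assumes the same df with string columns LOW, NORMAL and HIGH)
val numericFiltered = df.where(
  col("LOW")    =!= "null" ||
  col("NORMAL") =!= "null" ||
  col("HIGH")   =!= "null")
numericFiltered.show()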

1 Reply

0 votes
by (71.8m points)

Instead of:

df2 = df1.filter("Status=2" || "Status =3")

Try:

df2 = df1.filter($"Status" === 2 || $"Status" === 3)
