
pyspark - Enable case sensitivity for spark.sql globally

The option spark.sql.caseSensitive controls whether column names and the like are resolved case-sensitively. It can be set, for example, with

spark_session.sql('set spark.sql.caseSensitive=true')

and it is false by default.
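The same setting can also be toggled through the session's runtime configuration, which is equivalent to the SET statement above; a minimal sketch:

from pyspark.sql import SparkSession

spark_session = SparkSession.builder.getOrCreate()

# Equivalent to spark_session.sql('set spark.sql.caseSensitive=true')
spark_session.conf.set('spark.sql.caseSensitive', 'true')

# Read the value back to verify
print(spark_session.conf.get('spark.sql.caseSensitive'))  # 'true'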

It does not seem to be possible to enable it globally in $SPARK_HOME/conf/spark-defaults.conf with

spark.sql.caseSensitive: True

though. Is that intentional, or is there some other file for setting SQL options?

The source code also states that enabling this option is highly discouraged. What is the rationale behind that advice?
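To make the question concrete, here is a minimal sketch of the behavior the flag changes (the column names are purely illustrative):

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Two columns that differ only in case
df = spark.createDataFrame([(1, 2)], ['id', 'ID'])

# With the default spark.sql.caseSensitive=false, this reference is
# ambiguous and raises an AnalysisException:
# df.select('id')

# With case sensitivity enabled, it resolves to exactly one column
spark.conf.set('spark.sql.caseSensitive', 'true')
df.select('id').show()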


1 Reply


As it turns out, setting

spark.sql.caseSensitive: True

in $SPARK_HOME/conf/spark-defaults.conf DOES work after all. It just has to be set in the configuration of the Spark driver as well, not only on the master or the workers. Apparently I had forgotten that when I last tried.
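For completeness, if editing spark-defaults.conf on the driver machine is inconvenient, the property can also be set per application when the driver builds its session; a minimal sketch (the app name is hypothetical):

from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName('case-sensitive-app')  # hypothetical name
         .config('spark.sql.caseSensitive', 'true')
         .getOrCreate())

print(spark.conf.get('spark.sql.caseSensitive'))  # 'true'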

