0 votes
207 views
in Technique by (71.8m points)

python - Config file to define JSON Schema Structure in PySpark

I have created a PySpark application that reads a JSON file into a DataFrame using a defined schema. Code sample below:

from pyspark.sql.types import StructType, StructField, StringType, LongType

schema = StructType([
    StructField("domain", StringType(), True),
    StructField("timestamp", LongType(), True),
])
df = sqlContext.read.json(file, schema)

I need a way to define this schema in some kind of config or ini file and read it in the main PySpark application.

This would let me modify the schema if the JSON changes in the future, without touching the main PySpark code.



1 Reply

0 votes
by (71.8m points)

StructType provides json and jsonValue methods, which return the JSON string and dict representations respectively, and a fromJson method, which converts a Python dictionary back into a StructType.

from pyspark.sql.types import StructType, StructField, StringType, LongType

schema = StructType([
    StructField("domain", StringType(), True),
    StructField("timestamp", LongType(), True),
])

# Round-trip: serialize the schema to a dict, then rebuild an equivalent StructType
StructType.fromJson(schema.jsonValue())

The only thing you need beyond that is the built-in json module to parse the stored schema into a dict that StructType.fromJson can consume.
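For example, here is a minimal sketch of the config-file workflow the question asks about, reusing the schema, sqlContext, and file variables from above; the path schema.json is just an illustration:

import json
from pyspark.sql.types import StructType

# One-off step: persist the schema definition as a JSON config file
with open("schema.json", "w") as f:
    f.write(schema.json())

# In the main PySpark application: load the schema from the config file
with open("schema.json") as f:
    schema_from_config = StructType.fromJson(json.load(f))

df = sqlContext.read.json(file, schema_from_config)

When the JSON layout changes, only schema.json needs to be edited; the application code stays the same.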

For a Scala version, see How to create a schema from CSV file and persist/save that schema to a file?

