
python - MemoryError using json.dumps()

I would like to know which of json.dump() and json.dumps() is more efficient for encoding a large array to JSON.

Can you please show me an example of using json.dump()?

I am writing a Python CGI script that fetches a large amount of data from a MySQL database through the SQLAlchemy ORM. After some user-triggered processing, I store the final output in an array, which I then convert to JSON.

But when converting to JSON with:

    print json.dumps({'success': True, 'data': data})  # data is my array

I get the following error:

Traceback (most recent call last):
  File "C:/script/cgi/translate_parameters.py", line 617, in <module>
    f.write(json.dumps(mytab, default=dthandler, indent=4))
  File "C:\Python27\lib\json\__init__.py", line 250, in dumps
    sort_keys=sort_keys, **kw).encode(obj)
  File "C:\Python27\lib\json\encoder.py", line 209, in encode
    chunks = list(chunks)
MemoryError

So my guess is to use json.dump() to convert the data in chunks. Any ideas on how to do this?

Or do you have other ideas besides using json.dump()?


1 Reply


You can simply replace

f.write(json.dumps(mytab, default=dthandler, indent=4))

with

json.dump(mytab, f, default=dthandler, indent=4)
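
For completeness, a minimal, self-contained sketch of the fix (the file name, sample data, and the dthandler implementation below are illustrative stand-ins for your own):

    import datetime
    import json

    # Stand-in for the question's default handler: encode datetimes as
    # ISO-8601 strings, since json cannot serialize them natively.
    def dthandler(obj):
        if isinstance(obj, datetime.datetime):
            return obj.isoformat()
        raise TypeError("%r is not JSON serializable" % obj)

    # Illustrative stand-in for the large array built from SQLAlchemy rows.
    mytab = [{'id': i, 'created': datetime.datetime.now()} for i in range(1000)]

    with open('output.json', 'w') as f:
        # Write straight to the file instead of building the string in memory.
        json.dump(mytab, f, default=dthandler, indent=4)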

This should "stream" the data into the file: json.dump() hands its output to f.write() chunk by chunk, so the whole JSON string never exists in memory at once.
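
In CPython, json.dump() essentially just loops over JSONEncoder.iterencode(), which yields the encoded output piece by piece. If you want that loop under your own control (for example, to report progress while writing), you can drive it yourself; a sketch reusing mytab and dthandler from above:

    import json

    encoder = json.JSONEncoder(default=dthandler, indent=4)

    with open('output.json', 'w') as f:
        # iterencode() yields the JSON text in small chunks, so only one
        # chunk lives in memory at a time.
        for chunk in encoder.iterencode(mytab):
            f.write(chunk)

One caveat: indent=4 inflates the output considerably for a large array; dropping it, or passing separators=(',', ':'), makes the file smaller and the encoding faster.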

