
Simple Python multiprocessing function in Spyder doesn't output results

I have this very simple function that I'm trying to run and test, but it doesn't output anything and it doesn't raise any errors either. I've checked the code multiple times and can't find anything wrong with it.

I printed jobs and here's what I got:

[<Process(Process-12, stopped[1])>, 
<Process(Process-13, stopped[1])>,
<Process(Process-14, stopped[1])>, 
<Process(Process-15, stopped[1])>,
<Process(Process-16, stopped[1])>]

Here's the code:

import multiprocessing

def worker(num):
    print "worker ", num
    return

jobs = []
for i in range(5):
    p = multiprocessing.Process(target = worker, args = (i,))
    jobs.append(p)
    p.start()

Here's the result I'm expecting, but nothing is printed:

Worker: 0
Worker: 1
Worker: 2
Worker: 3
Worker: 4

1 Reply


The comments revealed that the OP uses Windows as well as Spyder. Since Spyder redirects stdout and Windows does not support forking, a new child process won't print into the Spyder console: the child's stdout is Python's vanilla stdout, which can also be found in sys.__stdout__, not the stream that Spyder captures.
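
A quick way to see this redirection (a minimal sketch; run it once inside the Spyder console and once in a plain terminal):

import sys

# In a Spyder/IPython console stdout has been replaced, so this typically
# prints False; in a plain terminal interpreter it prints True.
print(sys.stdout is sys.__stdout__)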

There are two alternatives:

  1. Using the logging module. This means creating and logging all messages to one or several files. With a single log file the output may come out slightly garbled, since the processes write to it concurrently; using one file per process avoids this (see the first sketch after this list).

  2. Not using print within the child processes, but returning the results to the main process instead: either through a queue (e.g. multiprocessing.Manager().Queue(), since forking is not possible on Windows; see the second sketch after this list), or more simply by relying on the multiprocessing Pool's map functionality (see the example further below).
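
A minimal sketch of option 1, assuming one log file per process (the file name pattern worker_<num>.log is only an illustration):

import logging
import multiprocessing

def worker(num):
    # each child process writes to its own file, so lines never interleave
    logging.basicConfig(filename="worker_%d.log" % num, level=logging.INFO)
    logging.info("worker %d", num)

if __name__ == '__main__':
    jobs = []
    for i in range(5):
        p = multiprocessing.Process(target=worker, args=(i,))
        jobs.append(p)
        p.start()
    for p in jobs:
        p.join()

And a minimal sketch of the queue variant of option 2, keeping the explicit Process objects from the question but printing only in the main process:

import multiprocessing

def worker(num, queue):
    # put the message on the queue instead of printing it in the child
    queue.put("worker %d" % num)

def main():
    manager = multiprocessing.Manager()
    queue = manager.Queue()

    jobs = []
    for i in range(5):
        p = multiprocessing.Process(target=worker, args=(i, queue))
        jobs.append(p)
        p.start()
    for p in jobs:
        p.join()

    # print in the main process, where the Spyder console can show it
    while not queue.empty():
        print(queue.get())

if __name__ == '__main__':
    main()

(A plain multiprocessing.Queue should also work here; the important part is that all printing happens in the parent process.)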

Multiprocessing example with a Pool:

import multiprocessing

def worker(num):
    """Returns the string of interest"""
    return "worker %d" % num

def main():
    pool = multiprocessing.Pool(4)
    results = pool.map(worker, range(10))

    pool.close()
    pool.join()

    for result in results:
        # prints the result string in the main process
        print(result)

if __name__ == '__main__':
    # Always protect the entry point like this when using multiprocessing (required on Windows)
    main()

which prints (in the main process)

worker 0
worker 1
worker 2
worker 3
worker 4
worker 5
worker 6
worker 7
worker 8
worker 9

EDIT: If you are too impatient to wait for the map function to finish, you can print your results immediately by using imap_unordered and slightly changing the order of the statements:

def main():
    pool = multiprocessing.Pool(4)
    results = pool.imap_unordered(worker, range(10))

    for result in results:
        # prints the result string in the main process as soon as they are ready
        # but results are now no longer in order!
        print(result)

    # The pool should join after printing all results
    pool.close()
    pool.join()
