I'm looking to run one Python file and have it spawn three or so threads/processes that run other Python files 'concurrently'. I would also like the original file (the one that spawned the three threads/processes) to be able to monitor all three, restart them if they crash, report errors, and so on.
I have been looking at using _thread and other approaches, but it appears that a thread can only run a function, not another file. That won't work for me; it needs to be another file that is run.
I looked at using subprocess.run(), which does successfully run the external Python file, but it would not let me spawn three subprocesses because it halts execution while the child program is running.
I ran the test below: the output prints "before", then starts running the test program (which just blinks an LED indefinitely), so "after" is never printed.
from subprocess import run

def launch_threads():
    print("before")
    run(["python3 test.py"], check=True, shell=True)  # blocks until test.py exits
    print("after")
One approach I can think of is to spawn three threads and have the function each thread runs launch one of the three subprocesses, but I was wondering whether there is a better way to accomplish this. Pseudocode of this idea is below.
import threading
from subprocess import run, CalledProcessError

def function1():  # Example thread function: run program1.py and restart it if it crashes
    try:
        run(["python3", "program1.py"], check=True)
    except CalledProcessError:
        handle_crash_and_restart_prog()

def launch_threads():
    thread1 = threading.Thread(target=function1)
    thread2 = threading.Thread(target=function2)
    thread3 = threading.Thread(target=function3)
    for t in (thread1, thread2, thread3):
        t.start()
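For comparison, a rough sketch of a subprocess-only version of the same idea, with no threads at all: keep one Popen handle per script, poll them in a loop, and relaunch any that have exited. The script names are the placeholders from the pseudocode above, and the restart policy (restart on any exit, check once per second) is only an assumption.

import time
from subprocess import Popen

SCRIPTS = ["program1.py", "program2.py", "program3.py"]  # placeholder names

def supervise():
    # Start one child process per script; Popen does not block.
    procs = {name: Popen(["python3", name]) for name in SCRIPTS}
    while True:
        for name, proc in procs.items():
            code = proc.poll()        # None while the child is still running
            if code is not None:      # the child has exited (crashed or finished)
                print(f"{name} exited with code {code}, restarting")
                procs[name] = Popen(["python3", name])
        time.sleep(1)                 # avoid busy-waiting

if __name__ == "__main__":
    supervise()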
question from: https://stackoverflow.com/questions/66065712/how-to-start-and-concurrently-monitor-multiple-threads-processes-in-python