Using multiprocessing with runpy

I have a Python module that uses multiprocessing. I'm executing this module from another script with runpy. However, this results in (1) the module running twice, and (2) the multiprocessing jobs never finishing (the script just hangs).

In my minimal working example, I have a script runpy_test.py:

import runpy
runpy.run_module('module_test')

and a directory module_test containing an empty __init__.py and a __main__.py:

from multiprocessing import Pool

print 'start'
def f(x):
    return x*x
pool = Pool()
result = pool.map(f, [1,2,3])
print 'done'

When I run runpy_test.py, I get:

start
start

and the script hangs.

If I remove the pool.map call (or if I run __main__.py directly, including the pool.map call), I get:

start
done

I'm running this on Scientific Linux 7.6 in Python 2.7.5.


Rewrite your __main__.py like so:

from multiprocessing import Pool
from .implementation import f

print 'start'
pool = Pool()
result = pool.map(f, [1,2,3])
print 'done'

And then write an implementation.py (you can call this whatever you want) in which your function is defined:

def f(x):
    return x*x

Otherwise you will have the same problem with most of the interfaces in multiprocessing, independently of whether you use runpy. As @Weeble explained, when Pool.map tries to load the function f in each sub-process it will import <your_package>.__main__, where your function is defined, but since you have executable code at module level in __main__, the sub-process will re-execute it.

Aside from this technical reason, it is also better design in terms of separation of concerns and testing: you can now easily import and call the function f (including for test purposes) without running it in parallel.
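
For example, a quick sanity check (a minimal sketch, assuming the module_test package from the question is on sys.path) can exercise f without touching multiprocessing at all:

from module_test.implementation import f

# f is now importable on its own, so it can be checked serially,
# with no Pool and no worker processes involved
assert f(3) == 9
print('implementation.f works')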

Try defining your function f in a separate module. To be passed to the pool processes it needs to be serialised, and those processes then recreate it by importing the module it was defined in. However, the __main__.py file it currently lives in isn't a module, or at least not a well-behaved one: attempting to import it creates another Pool and another invocation of map, which is a recipe for disaster.
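
A quick way to see why that import happens (a minimal sketch, not from the original answer): pickle serialises a plain function by reference, recording only its defining module and name, so the receiving process has to import that module to rebuild the function:

import pickle

def f(x):
    return x * x

# The payload contains no bytecode, only a reference roughly of the form
# (module name, attribute name); the unpickling side resolves it by
# importing that module and looking the name up again.
print(repr(pickle.dumps(f)))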

Although not the "right" way to do it, one solution that ended up working for me was to use runpy's _run_module_as_main instead of run_module. This was ideal for me since I was working with someone else's code and it required the fewest changes.
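
Note that _run_module_as_main is a private runpy helper (it is what python -m uses under the hood), so it may change between Python versions. A minimal sketch of the adjusted runpy_test.py under that caveat:

import runpy

# Private API: runs the package's __main__ with __name__ set to '__main__'
# and registers it in sys.modules, closer to how "python -m module_test"
# behaves than runpy.run_module('module_test') is.
runpy._run_module_as_main('module_test')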
