Preventing oversubscription with multithreaded code? #1012
This is probably a feature request.

I've been finding that running `pytest -n auto` can take a very long time or even just seem to hang when testing one of my codebases. I believe the culprit is that a number of tests using matrix multiplication or parallelized numba functions are being run at the same time. Both of these default to using the number of cores as the number of threads, so my CPU becomes heavily oversubscribed.

Would it be possible for pytest-xdist to use something like `threadpoolctl` to limit the number of threads each worker uses? Ideally it could be similar to how `joblib` or `dask` set the number of threads available to each worker to something like `hardware_threads // num_workers`.

Alternatively, is there a good way I could set this behaviour myself? Ideally without hardcoding the number of threads to use.
xdist does not use or control threads in a direct manner.
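As a stopgap for the "set this behaviour myself" question, one sketch (not part of pytest-xdist, and only an assumption about which libraries are in play) is to derive a per-worker thread count from `PYTEST_XDIST_WORKER_COUNT` and export the usual thread-count environment variables from a top-level `conftest.py`. The variable names below (`OMP_NUM_THREADS`, `OPENBLAS_NUM_THREADS`, `MKL_NUM_THREADS`, `NUMBA_NUM_THREADS`) only take effect if they are set before the corresponding library initialises its thread pool.

```python
# conftest.py -- hypothetical sketch, not part of pytest-xdist.
# Cap per-worker threads by exporting the usual *_NUM_THREADS variables
# before NumPy/BLAS/numba create their thread pools; this only works if
# this module is imported before those libraries are.
import os

n_workers = int(os.environ.get("PYTEST_XDIST_WORKER_COUNT", "1"))
threads_per_worker = max(1, (os.cpu_count() or 1) // n_workers)

for var in ("OMP_NUM_THREADS", "OPENBLAS_NUM_THREADS",
            "MKL_NUM_THREADS", "NUMBA_NUM_THREADS"):
    # setdefault keeps any limit the user has already set explicitly.
    os.environ.setdefault(var, str(threads_per_worker))
```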
I've worked out a basic version of what I'd like to have. I suspect there are a number of edge cases that this would hit:

```python
import pytest

@pytest.fixture(autouse=True, scope="session")
def limit_threading():
    import os
    import threadpoolctl
    if "PYTEST_XDIST_WORKER_COUNT" in os.environ:
        n_workers = int(os.environ["PYTEST_XDIST_WORKER_COUNT"])
        # Share the machine's cores evenly between the xdist workers.
        max_threads = max(1, os.cpu_count() // n_workers)
        with threadpoolctl.threadpool_limits(limits=max_threads):
            yield
    else:
        yield  # not running under xdist; the fixture must still yield once
```

This is significantly cutting down the time to run my test suite on a machine with 16 cores when using `-n auto`.
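For anyone trying this, a rough way to check inside a worker that the limit actually took effect is to inspect `threadpoolctl.threadpool_info()`. The test below is a made-up example, and it only covers the BLAS/OpenMP pools that threadpoolctl knows about, not numba's own threading layer.

```python
# test_thread_limits.py -- hypothetical sanity check for the fixture above.
import os

import threadpoolctl


def test_threadpools_are_capped():
    n_workers = int(os.environ.get("PYTEST_XDIST_WORKER_COUNT", "1"))
    expected = max(1, (os.cpu_count() or 1) // n_workers)
    # threadpool_info() returns one dict per loaded BLAS/OpenMP runtime.
    for pool in threadpoolctl.threadpool_info():
        assert pool["num_threads"] <= expected
```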