
pytest-forked's Introduction

pytest-forked: run each test in a forked subprocess

Warning

This is an extraction of the xdist --forked module; future maintenance beyond the bare minimum is not planned until a new maintainer is found.

This plugin does not work on Windows because there's no fork support.

  • --forked: run each test in a forked subprocess to survive segfaults or otherwise dying processes.


Installation

Install the plugin with:

pip install pytest-forked

or install the package in develop/in-place mode from a checkout of the pytest-forked repository:

pip install -e .

Usage examples

If you have tests involving C or C++ libraries, you might have to deal with tests crashing the process. In that case you may use the boxing option:

pytest --forked

which will run each test in a subprocess and report whether a test crashed the process. You can also combine this option with pytest-xdist to run tests in multiple processes, speeding up the test run and making use of your CPU cores:

pytest -n3 --forked

This runs 3 testing subprocesses in parallel, each of which forks a new subprocess for every test.
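
As an illustrative sketch (the test name below is made up), a test like this would normally take the whole pytest process down with a segmentation fault; under --forked only the forked child dies and the crash is reported as a regular test failure:

import ctypes

def test_crashes_the_process():
    # Dereferencing a NULL pointer segfaults the forked child;
    # pytest-forked turns that into a reported test failure instead
    # of killing the whole test session.
    ctypes.string_at(0)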

You can also fork for individual tests:

import pytest

@pytest.mark.forked
def test_with_leaky_state():
    run_some_monkey_patches()

This test will be unconditionally boxed, regardless of the CLI flag.
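
If every test in a module needs this, pytest's standard module-level pytestmark hook can apply the mark in one place; a small sketch (the test name is illustrative):

import pytest

# Fork every test collected from this module.
pytestmark = pytest.mark.forked

def test_with_global_side_effects():
    run_some_monkey_patches()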

pytest-forked's People

Contributors

asottile, bluetech, felixonmars, hugovk, mgorny, nicoddemus, pre-commit-ci[bot], ronnypfannschmidt, stanislavlevin, swt2c, the-compiler, tucked, untitaker, webknjaz


pytest-forked's Issues

next release ETA

Hello, I see that a commit adding Python 3.10 support has been merged into master.
When do you plan to release a version including this commit?
Thanks

Build failure due to setuptools in pyproject.toml

It looks like https://github.com/pytest-dev/pytest-forked/blob/master/pyproject.toml is causing build failures due to the setuptools ~= 41.4 pin (console output below). This has been seen in other packages as well. A possible solution is to get rid of the file, or at least of the setuptools pin.

10:13:52 Collecting  #pytest-forked
10:13:52   Downloading pytest-forked-1.3.0.tar.gz (9.8 kB)
10:13:52   Installing build dependencies: started
10:13:53   Installing build dependencies: finished with status 'error'
10:13:53   ERROR: Command errored out with exit status 1:
10:13:53    command: /usr/local/bin/python /usr/local/lib/python3.7/site-packages/pip install --ignore-installed --no-user --prefix /tmp/pip-build-env-hbrecn75/overlay --no-warn-script-location --no-binary :all: --only-binary :none: -i http://build.locusdev.net:10001/simple --extra-index-url https://pypi.python.org/simple --trusted-host build.locusdev.net -- 'setuptools ~= 41.4' 'setuptools_scm ~= 3.3' 'wheel ~= 0.33.6'
10:13:53        cwd: None
10:13:53   Complete output (20 lines):
10:13:53   Looking in indexes: http://build.locusdev.net:10001/simple, https://pypi.python.org/simple/, https://pypi.python.org/simple
10:13:53   Collecting setuptools~=41.4
10:13:53     Downloading setuptools-41.6.0.zip (852 kB)
10:13:53       ERROR: Command errored out with exit status 1:
10:13:53        command: /usr/local/bin/python -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-fobv7n3k/setuptools/setup.py'"'"'; __file__='"'"'/tmp/pip-install-fobv7n3k/setuptools/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' egg_info --egg-base /tmp/pip-install-fobv7n3k/setuptools/pip-egg-info
10:13:53            cwd: /tmp/pip-install-fobv7n3k/setuptools/
10:13:53       Complete output (9 lines):
10:13:53       Traceback (most recent call last):
10:13:53         File "<string>", line 1, in <module>
10:13:53         File "/tmp/pip-install-fobv7n3k/setuptools/setuptools/__init__.py", line 6, in <module>
10:13:53           import distutils.core
10:13:53         File "/usr/local/lib/python3.7/site-packages/_distutils_hack/__init__.py", line 83, in create_module
10:13:53           return importlib.import_module('setuptools._distutils')
10:13:53         File "/usr/local/lib/python3.7/importlib/__init__.py", line 127, in import_module
10:13:53           return _bootstrap._gcd_import(name[level:], package, level)
10:13:53       ModuleNotFoundError: No module named 'setuptools._distutils'
10:13:53       ----------------------------------------
10:13:53   ERROR: Command errored out with exit status 1: python setup.py egg_info Check the logs for full command output.
10:13:53   WARNING: You are using pip version 20.0.2; however, version 20.2.2 is available.
10:13:53   You should consider upgrading via the '/usr/local/bin/python -m pip install --upgrade pip' command.
10:13:53   ----------------------------------------
10:13:53 ERROR: Command errored out with exit status 1: /usr/local/bin/python /usr/local/lib/python3.7/site-packages/pip install --ignore-installed --no-user --prefix /tmp/pip-build-env-hbrecn75/overlay --no-warn-script-location --no-binary :all: --only-binary :none: -i http://build.locusdev.net:10001/simple --extra-index-url https://pypi.python.org/simple --trusted-host build.locusdev.net -- 'setuptools ~= 41.4' 'setuptools_scm ~= 3.3' 'wheel ~= 0.33.6' Check the logs for full command output.
10:13:53 WARNING: You are using pip version 20.0.2; however, version 20.2.2 is available.
10:13:53 You should consider upgrading via the '/usr/local/bin/python -m pip install --upgrade pip' command.

pytest --boxed test in windows assert 'ForkedFunc' object has no attribute 'pid'

INTERNALERROR>   File "c:\python35-x64\lib\site-packages\pytest_forked\__init__.py", line 35, in pytest_runtest_protocol
INTERNALERROR>     reports = forked_run_report(item)
INTERNALERROR>   File "c:\python35-x64\lib\site-packages\pytest_forked\__init__.py", line 55, in forked_run_report
INTERNALERROR>     ff = py.process.ForkedFunc(runforked)
INTERNALERROR>   File "c:\python35-x64\lib\site-packages\py\_process\forkedfunc.py", line 45, in __init__
INTERNALERROR>     pid = os.fork()
INTERNALERROR> AttributeError: module 'os' has no attribute 'fork'
======================== no tests ran in 0.17 seconds =========================
Exception ignored in: <bound method ForkedFunc.__del__ of <py._process.forkedfunc.ForkedFunc object at 0x000000E9B647F898>>
Traceback (most recent call last):
  File "c:\python35-x64\lib\site-packages\py\_process\forkedfunc.py", line 110, in __del__
    if self.pid is not None:  # only clean up in main process
AttributeError: 'ForkedFunc' object has no attribute 'pid'

details:
https://ci.appveyor.com/project/nooperpudd/quantbube/build/1.0.38/job/7pdjtq6p78hu2127
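
Since the plugin relies on os.fork, a hedged workaround (not an official feature of the plugin) is to skip fork-marked tests on platforms without fork support; the test name here is hypothetical:

import os

import pytest

# Skip on platforms without os.fork (e.g. Windows), where
# pytest-forked cannot box the test.
needs_fork = pytest.mark.skipif(
    not hasattr(os, "fork"), reason="requires os.fork"
)

@needs_fork
@pytest.mark.forked
def test_something_crashy():
    assert True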

[tests] test_xfail fails against Pytest 8

============================= test session starts ==============================
platform linux -- Python 3.12.1, pytest-8.0.0, pluggy-1.4.0
rootdir: /usr/src/RPM/BUILD/pytest-forked
configfile: tox.ini
plugins: forked-1.6.1.dev4+gd9d05e2
collected 10 items

testing/test_boxed.py ...xx.                                             [ 60%]
testing/test_xfail_behavior.py ...F                                      [100%]

=================================== FAILURES ===================================
_________________________ test_xfail[non-strict xpass] _________________________

is_crashing = False, is_strict = False
testdir = <Testdir local('/usr/src/tmp/pytest-of-builder/pytest-4/test_xfail3')>

    @pytest.mark.parametrize(
        ("is_crashing", "is_strict"),
        (
            pytest.param(True, True, id="strict xfail"),
            pytest.param(False, True, id="strict xpass"),
            pytest.param(True, False, id="non-strict xfail"),
            pytest.param(False, False, id="non-strict xpass"),
        ),
    )
    def test_xfail(is_crashing, is_strict, testdir):
        """Test xfail/xpass/strict permutations."""

...
   
        pytest_run_result = testdir.runpytest(test_module, "-ra")
>       pytest_run_result.stdout.fnmatch_lines(expected_lines)
E       Failed: fnmatch: '*==== test session starts ====*'
E       Failed: fnmatch: '*==== test session starts ====*'
E          with: '============================= test session starts =============================='
E       nomatch: 'plugins: forked*'
E           and: 'platform linux -- Python 3.12.1, pytest-8.0.0, pluggy-1.4.0'
E           and: 'rootdir: /usr/src/tmp/pytest-of-builder/pytest-4/test_xfail3'
E       fnmatch: 'plugins: forked*'
E          with: 'plugins: forked-1.6.1.dev4+gd9d05e2'
E       exact match: 'collected 1 item'
E       nomatch: 'test_xfail.py X*'
E           and: ''
E       fnmatch: 'test_xfail.py X*'
E          with: 'test_xfail.py X                                                          [100%]'
E       nomatch: '*==== short test summary info ====*'
E           and: ''
E           and: '=================================== XPASSES ===================================='
E       fnmatch: '*==== short test summary info ====*'
E          with: '=========================== short test summary info ============================'
E       nomatch: 'XPASS test_xfail.py::test_function The process gets terminated'
E           and: 'XPASS test_xfail.py::test_function - The process gets terminated'
E           and: '============================== 1 xpassed in 0.03s =============================='
E       remains unmatched: 'XPASS test_xfail.py::test_function The process gets terminated'

/usr/src/RPM/BUILD/pytest-forked/testing/test_xfail_behavior.py:121: Failed
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.12.1, pytest-8.0.0, pluggy-1.4.0
rootdir: /usr/src/tmp/pytest-of-builder/pytest-4/test_xfail3
plugins: forked-1.6.1.dev4+gd9d05e2
collected 1 item

test_xfail.py X                                                          [100%]

=================================== XPASSES ====================================
=========================== short test summary info ============================
XPASS test_xfail.py::test_function - The process gets terminated
============================== 1 xpassed in 0.03s ==============================

https://docs.pytest.org/en/stable/changelog.html#pytest-8-0-0rc2-2024-01-17

For xpasses, add - in summary between test name and reason, to match how xfail is displayed.
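
In other words, pytest 8 now inserts a "- " separator between the test id and the xpass reason, so the expected-line patterns in testing/test_xfail_behavior.py no longer match. A hedged sketch of one way the pattern could be loosened (not necessarily the fix adopted upstream):

# Let fnmatch tolerate both the pre-8 "XPASS <id> <reason>" layout and
# the pytest >= 8 "XPASS <id> - <reason>" layout via a wildcard.
short_test_summary = "XPASS test_xfail.py::test_function*The process gets terminated"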

Missing warnings with pytest --forked option

Hi,

I found out in openai/gym#2647 that the pytest --forked option seems to miss many warnings in its report.

With the --forked option I get only 2 warnings:

❯ docker run gym-docker pytest --forked --import-mode=append
[...]
============================= test session starts ==============================
platform linux -- Python 3.10.2, pytest-6.2.5, py-1.11.0, pluggy-1.0.0
rootdir: /usr/local/gym
plugins: forked-1.4.0
collected 623 items / 5 skipped / 618 selected
[...]
=============================== warnings summary ===============================
../lib/python3.10/site-packages/gym/wrappers/monitoring/video_recorder.py:9
  /usr/local/lib/python3.10/site-packages/gym/wrappers/monitoring/video_recorder.py:9: DeprecationWarning: The distutils package is deprecated and slated for removal in Python 3.12. Use setuptools or check PEP 632 for potential alternatives
    import distutils.spawn

../lib/python3.10/site-packages/gym/spaces/box.py:78
  /usr/local/lib/python3.10/site-packages/gym/spaces/box.py:78: UserWarning: WARN: Box bound precision lowered by casting to float32
    logger.warn(f"Box bound precision lowered by casting to {self.dtype}")

-- Docs: https://docs.pytest.org/en/stable/warnings.html
================= 621 passed, 7 skipped, 2 warnings in 39.42s ==================

Now without the --forked option I get ~1500 warnings:

❯ docker run gym-docker pytest --import-mode=append
[...]
============================= test session starts ==============================
platform linux -- Python 3.10.2, pytest-6.2.5, py-1.11.0, pluggy-1.0.0
rootdir: /usr/local/gym
plugins: forked-1.4.0
collected 623 items / 5 skipped / 618 selected
[...]
=============================== warnings summary ===============================
../lib/python3.10/site-packages/gym/wrappers/monitoring/video_recorder.py:9
  /usr/local/lib/python3.10/site-packages/gym/wrappers/monitoring/video_recorder.py:9: DeprecationWarning: The distutils package is deprecated and slated for removal in Python 3.12. Use setuptools or check PEP 632 for potential alternatives
    import distutils.spawn

../lib/python3.10/site-packages/gym/spaces/box.py:78
tests/wrappers/test_normalize.py::test_normalize_observation[False]
tests/wrappers/test_normalize.py::test_normalize_observation[True]
tests/wrappers/test_normalize.py::test_normalize_reset_info
tests/wrappers/test_normalize.py::test_normalize_return
tests/wrappers/test_normalize.py::test_normalize_observation_vector_env
tests/wrappers/test_normalize.py::test_normalize_return_vector_env
  /usr/local/lib/python3.10/site-packages/gym/spaces/box.py:78: UserWarning: WARN: Box bound precision lowered by casting to float32
    logger.warn(f"Box bound precision lowered by casting to {self.dtype}")

tests/envs/test_determinism.py: 80 warnings
tests/envs/test_envs.py: 137 warnings
tests/spaces/test_spaces.py: 266 warnings
tests/spaces/test_utils.py: 176 warnings
tests/utils/test_env_checker.py: 2 warnings
tests/vector/test_async_vector_env.py: 16 warnings
tests/vector/test_numpy_utils.py: 64 warnings
tests/vector/test_shared_memory.py: 128 warnings
tests/vector/test_sync_vector_env.py: 8 warnings
tests/wrappers/test_normalize.py: 6 warnings
tests/wrappers/test_record_episode_statistics.py: 181 warnings
tests/wrappers/test_record_video.py: 398 warnings
tests/wrappers/test_time_aware_observation.py: 2 warnings
tests/wrappers/test_transform_observation.py: 1 warning
tests/wrappers/test_transform_reward.py: 14 warnings
  /usr/local/lib/python3.10/site-packages/gym/utils/seeding.py:47: DeprecationWarning: WARN: Function `rng.randint(low, [high, size, dtype])` is marked as deprecated and will be removed in the future. Please use `rng.integers(low, [high, size, dtype])` instead.
    deprecation(

tests/envs/test_envs.py::test_env[spec1]
tests/envs/test_envs.py::test_env[spec2]
tests/envs/test_envs.py::test_env[spec10]
tests/envs/test_envs.py::test_env[spec11]
  /usr/local/lib/python3.10/site-packages/gym/utils/env_checker.py:116: UserWarning: WARN: Agent's minimum observation space value is -infinity. This is probably too low.
    logger.warn(

tests/envs/test_envs.py::test_env[spec1]
tests/envs/test_envs.py::test_env[spec2]
tests/envs/test_envs.py::test_env[spec10]
tests/envs/test_envs.py::test_env[spec11]
  /usr/local/lib/python3.10/site-packages/gym/utils/env_checker.py:120: UserWarning: WARN: Agent's maxmimum observation space value is infinity. This is probably too high
    logger.warn(

tests/envs/test_envs.py::test_env[spec4]
tests/envs/test_envs.py::test_env[spec14]
  /usr/local/lib/python3.10/site-packages/gym/utils/env_checker.py:162: UserWarning: WARN: We recommend you to use a symmetric and normalized Box action space (range=[-1, 1]) cf https://stable-baselines3.readthedocs.io/en/master/guide/rl_tips.html
    logger.warn(

tests/envs/test_envs.py::test_env[spec14]
tests/envs/test_envs.py::test_env[spec14]
tests/envs/test_envs.py::test_env[spec14]
tests/envs/test_envs.py::test_env[spec14]
  /usr/local/lib/python3.10/site-packages/gym/envs/classic_control/pendulum.py:183: DeprecationWarning: In future, it will be an error for 'np.bool_' scalars to be interpreted as an index
    scale_img = pygame.transform.flip(scale_img, is_flip, True)

tests/spaces/test_spaces.py::test_box_dtype_check
tests/vector/test_spaces.py::test_iterate[Box0]
tests/vector/test_spaces.py::test_iterate[Box4]
tests/vector/test_spaces.py::test_iterate[Dict1]
tests/vector/test_spaces.py::test_iterate_custom_space[Tuple]
  /usr/local/lib/python3.10/site-packages/gym/spaces/box.py:153: UserWarning: WARN: Casting input x to numpy array.    logger.warn("Casting input x to numpy array.")

tests/vector/test_async_vector_env.py::test_custom_space_async_vector_env_shared_memory
  /usr/local/lib/python3.10/site-packages/_pytest/unraisableexception.py:78: PytestUnraisableExceptionWarning: Exception ignored in: <function AsyncVectorEnv.__del__ at 0x7f0366be9ea0>
  
  Traceback (most recent call last):
    File "/usr/local/lib/python3.10/site-packages/gym/vector/async_vector_env.py", line 631, in __del__
      self.close(terminate=True)
    File "/usr/local/lib/python3.10/site-packages/gym/vector/vector_env.py", line 193, in close
      self.close_extras(**kwargs)
    File "/usr/local/lib/python3.10/site-packages/gym/vector/async_vector_env.py", line 543, in close_extras
      if self._state != AsyncState.DEFAULT:
  AttributeError: 'AsyncVectorEnv' object has no attribute '_state'
  
    warnings.warn(pytest.PytestUnraisableExceptionWarning(msg))

tests/wrappers/test_record_video.py::test_record_video_reset_return_info
  /usr/local/lib/python3.10/site-packages/_pytest/unraisableexception.py:78: PytestUnraisableExceptionWarning: Exception ignored in: <function VideoRecorder.__del__ at 0x7f0366c06b90>
  
  Traceback (most recent call last):
    File "/usr/local/lib/python3.10/site-packages/gym/wrappers/monitoring/video_recorder.py", line 193, in __del__
      self.close()
    File "/usr/local/lib/python3.10/site-packages/gym/wrappers/monitoring/video_recorder.py", line 182, in close
      self.write_metadata()
    File "/usr/local/lib/python3.10/site-packages/gym/wrappers/monitoring/video_recorder.py", line 188, in write_metadata
      with open(self.metadata_path, "w") as f:
  FileNotFoundError: [Errno 2] No such file or directory: '/usr/local/gym/videos/rl-video-step-0.meta.json'
  
    warnings.warn(pytest.PytestUnraisableExceptionWarning(msg))

tests/wrappers/test_video_recorder.py::test_record_breaking_render_method
  /usr/local/lib/python3.10/site-packages/gym/wrappers/monitoring/video_recorder.py:136: UserWarning: WARN: Env returned None on render(). Disabling further rendering for video recorder by marking as disabled: path=/tmp/tmpqrycjchl.mp4 metadata_path=/tmp/tmpqrycjchl.meta.json
    logger.warn(

-- Docs: https://docs.pytest.org/en/stable/warnings.html
================ 621 passed, 7 skipped, 1509 warnings in 21.03s ================

Steps to reproduce:

git clone git@github.com:openai/gym.git
cd gym
wget https://www.roboti.us/file/mjkey.txt
docker build -f py.Dockerfile --build-arg MUJOCO_KEY=./mjkey.txt --build-arg PYTHON_VERSION=3.10 --tag gym-docker .

Then run either docker run gym-docker pytest --forked --import-mode=append or docker run gym-docker pytest --import-mode=append

Let me know if I can do anything else to help troubleshoot!

Why pytest-forked helps?

Hi, I have been using the pytest-forked plugin, which is very helpful.
I am wondering why tests sometimes fail when they are run one after the other, but pass when they are run in separate processes thanks to your plugin.
I have a use case where I'd like to work with regular pytest (without the --forked option), and I am trying to troubleshoot why one test fails after another when running without --forked.

Thanks for any hints.
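
In cases like this, the usual culprit is shared interpreter state that one test mutates and a later test silently depends on; a minimal sketch with invented names (mylib and its CACHE are hypothetical):

import mylib  # hypothetical module holding a module-level CACHE dict

def test_populates_cache():
    # Mutates global state that survives into later tests when
    # everything runs in a single process.
    mylib.CACHE["answer"] = 42
    assert mylib.CACHE["answer"] == 42

def test_expects_empty_cache():
    # Passes under --forked (fresh process, empty cache), but fails
    # when run after test_populates_cache in the same process.
    assert "answer" not in mylib.CACHE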

Forked mode by default

Hi!
I'd like to run pytest in forked mode by default; could you add an ini option?

Update: never mind, I just found out about addopts.
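
For reference, a minimal sketch of that configuration, assuming a pytest.ini at the project root (tox.ini works the same way; setup.cfg uses a [tool:pytest] section instead):

[pytest]
addopts = --forked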

Useful against pollution of interpreter state

Just wanted to point out that this package can be a useful protection against "pollution" of the interpreter state.

For example, when testing the behaviour of custom imports, given the global state of sys and its caching mechanisms, having boxed processes prevents unknowingly reusing a module imported by another test.
And there might be more cases like this one.

We should probably gather them in one place (the README?) as they might be a bit esoteric for newcomers, and it would be useful to pin them down.
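
A hedged sketch of that kind of test (the use of the stdlib json module is just for illustration):

import importlib
import sys

import pytest

@pytest.mark.forked
def test_reimport_in_isolation():
    # Any modules removed or re-imported here only affect the forked
    # child's sys.modules; the parent interpreter stays clean.
    sys.modules.pop("json", None)
    fresh = importlib.import_module("json")
    assert fresh.loads("1") == 1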

test_xfail[strict_xfail|non_strict_xfail] fail with pytest 7.2.0

After upgrading pytest from 7.1.3 to 7.2.0 in nixpkgs we're seeing the following test failures:

pytest-forked> ============================= test session starts ==============================
pytest-forked> platform linux -- Python 3.10.9, pytest-7.2.0, pluggy-1.0.0 -- /nix/store/7c1qwrqf1grxa31wvq2na0vg2val7p9w-python3-3.10.9/bin/python3.10
pytest-forked> cachedir: .pytest_cache
pytest-forked> rootdir: /build/pytest-forked-1.4.0, configfile: tox.ini
pytest-forked> plugins: forked-1.4.0
pytest-forked> collected 10 items                                                             
pytest-forked> 
pytest-forked> testing/test_boxed.py::test_functional_boxed PASSED                      [ 10%]
pytest-forked> testing/test_boxed.py::test_functional_boxed_per_test PASSED             [ 20%]
pytest-forked> testing/test_boxed.py::test_functional_boxed_capturing[no] PASSED        [ 30%]
pytest-forked> testing/test_boxed.py::test_functional_boxed_capturing[sys] XFAIL (capture cleanup needed) [ 40%]
pytest-forked> testing/test_boxed.py::test_functional_boxed_capturing[fd] XFAIL (capture cleanup needed) [ 50%]
pytest-forked> testing/test_boxed.py::test_is_not_boxed_by_default PASSED               [ 60%]
pytest-forked> testing/test_xfail_behavior.py::test_xfail[strict xfail] FAILED          [ 70%]
pytest-forked> testing/test_xfail_behavior.py::test_xfail[strict xpass] PASSED          [ 80%]
pytest-forked> testing/test_xfail_behavior.py::test_xfail[non-strict xfail] FAILED      [ 90%]
pytest-forked> testing/test_xfail_behavior.py::test_xfail[non-strict xpass] PASSED      [100%]
pytest-forked> 
pytest-forked> =================================== FAILURES ===================================
pytest-forked> ___________________________ test_xfail[strict xfail] ___________________________
pytest-forked> 
pytest-forked> is_crashing = True, is_strict = True
pytest-forked> testdir = <Testdir local('/build/pytest-of-nixbld/pytest-0/test_xfail0')>
pytest-forked> 
pytest-forked>     @pytest.mark.parametrize(
pytest-forked>         ("is_crashing", "is_strict"),
pytest-forked>         (
pytest-forked>             pytest.param(True, True, id="strict xfail"),
pytest-forked>             pytest.param(False, True, id="strict xpass"),
pytest-forked>             pytest.param(True, False, id="non-strict xfail"),
pytest-forked>             pytest.param(False, False, id="non-strict xpass"),
pytest-forked>         ),
pytest-forked>     )
pytest-forked>     def test_xfail(is_crashing, is_strict, testdir):
pytest-forked>         """Test xfail/xpass/strict permutations."""
pytest-forked>         # pylint: disable=possibly-unused-variable
pytest-forked>         sig_num = signal.SIGTERM.numerator
pytest-forked>     
pytest-forked>         test_func_body = (
pytest-forked>             "os.kill(os.getpid(), signal.SIGTERM)" if is_crashing else "assert True"
pytest-forked>         )
pytest-forked>     
pytest-forked>         if is_crashing:
pytest-forked>             # marked xfailed and crashing, no matter strict or not
pytest-forked>             expected_letter = "x"  # XFAILED
pytest-forked>             expected_lowercase = "xfailed"
pytest-forked>             expected_word = "XFAIL"
pytest-forked>         elif is_strict:
pytest-forked>             # strict and not failing as expected should cause failure
pytest-forked>             expected_letter = "F"  # FAILED
pytest-forked>             expected_lowercase = "failed"
pytest-forked>             expected_word = FAILED_WORD
pytest-forked>         elif not is_strict:
pytest-forked>             # non-strict and not failing as expected should cause xpass
pytest-forked>             expected_letter = "X"  # XPASS
pytest-forked>             expected_lowercase = "xpassed"
pytest-forked>             expected_word = "XPASS"
pytest-forked>     
pytest-forked>         session_start_title = "*==== test session starts ====*"
pytest-forked>         loaded_pytest_plugins = "plugins: forked*"
pytest-forked>         collected_tests_num = "collected 1 item"
pytest-forked>         expected_progress = "test_xfail.py {expected_letter!s}*".format(**locals())
pytest-forked>         failures_title = "*==== FAILURES ====*"
pytest-forked>         failures_test_name = "*____ test_function ____*"
pytest-forked>         failures_test_reason = "[XPASS(strict)] The process gets terminated"
pytest-forked>         short_test_summary_title = "*==== short test summary info ====*"
pytest-forked>         short_test_summary = "{expected_word!s} test_xfail.py::test_function".format(
pytest-forked>             **locals()
pytest-forked>         )
pytest-forked>         if expected_lowercase == "xpassed":
pytest-forked>             # XPASS wouldn't have the crash message from
pytest-forked>             # pytest-forked because the crash doesn't happen
pytest-forked>             short_test_summary = " ".join(
pytest-forked>                 (
pytest-forked>                     short_test_summary,
pytest-forked>                     "The process gets terminated",
pytest-forked>                 )
pytest-forked>             )
pytest-forked>         reason_string = (
pytest-forked>             "  reason: The process gets terminated; "
pytest-forked>             "pytest-forked reason: "
pytest-forked>             "*:*: running the test CRASHED with signal {sig_num:d}".format(**locals())
pytest-forked>         )
pytest-forked>         total_summary_line = "*==== 1 {expected_lowercase!s} in 0.*s* ====*".format(
pytest-forked>             **locals()
pytest-forked>         )
pytest-forked>     
pytest-forked>         expected_lines = (
pytest-forked>             session_start_title,
pytest-forked>             loaded_pytest_plugins,
pytest-forked>             collected_tests_num,
pytest-forked>             expected_progress,
pytest-forked>         )
pytest-forked>         if expected_word == FAILED_WORD:
pytest-forked>             # XPASS(strict)
pytest-forked>             expected_lines += (
pytest-forked>                 failures_title,
pytest-forked>                 failures_test_name,
pytest-forked>                 failures_test_reason,
pytest-forked>             )
pytest-forked>         expected_lines += (
pytest-forked>             short_test_summary_title,
pytest-forked>             short_test_summary,
pytest-forked>         )
pytest-forked>         if expected_lowercase == "xpassed" and expected_word == FAILED_WORD:
pytest-forked>             # XPASS(strict)
pytest-forked>             expected_lines += (reason_string,)
pytest-forked>         expected_lines += (total_summary_line,)
pytest-forked>     
pytest-forked>         test_module = testdir.makepyfile(
pytest-forked>             f"""
pytest-forked>             import os
pytest-forked>             import signal
pytest-forked>     
pytest-forked>             import pytest
pytest-forked>     
pytest-forked>             # The current implementation emits RuntimeWarning.
pytest-forked>             pytestmark = pytest.mark.filterwarnings('ignore:pytest-forked xfail')
pytest-forked>     
pytest-forked>             @pytest.mark.xfail(
pytest-forked>                 reason='The process gets terminated',
pytest-forked>                 strict={is_strict!s},
pytest-forked>             )
pytest-forked>             @pytest.mark.forked
pytest-forked>             def test_function():
pytest-forked>                 {test_func_body!s}
pytest-forked>             """
pytest-forked>         )
pytest-forked>     
pytest-forked>         pytest_run_result = testdir.runpytest(test_module, "-ra")
pytest-forked> >       pytest_run_result.stdout.fnmatch_lines(expected_lines)
pytest-forked> E       Failed: fnmatch: '*==== test session starts ====*'
pytest-forked> E          with: '============================= test session starts =============================='
pytest-forked> E       nomatch: 'plugins: forked*'
pytest-forked> E           and: 'platform linux -- Python 3.10.9, pytest-7.2.0, pluggy-1.0.0'
pytest-forked> E           and: 'rootdir: /build/pytest-of-nixbld/pytest-0/test_xfail0'
pytest-forked> E       fnmatch: 'plugins: forked*'
pytest-forked> E          with: 'plugins: forked-1.4.0'
pytest-forked> E       exact match: 'collected 1 item'
pytest-forked> E       nomatch: 'test_xfail.py x*'
pytest-forked> E           and: ''
pytest-forked> E       fnmatch: 'test_xfail.py x*'
pytest-forked> E          with: 'test_xfail.py x                                                          [100%]'
pytest-forked> E       nomatch: '*==== short test summary info ====*'
pytest-forked> E           and: ''
pytest-forked> E       fnmatch: '*==== short test summary info ====*'
pytest-forked> E          with: '=========================== short test summary info ============================'
pytest-forked> E       nomatch: 'XFAIL test_xfail.py::test_function'
pytest-forked> E           and: 'XFAIL test_xfail.py::test_function - reason: The process gets terminated; pytest-forked reason: :-1: running the test CRASHED with signal 15'
pytest-forked> E           and: '============================== 1 xfailed in 0.01s =============================='
pytest-forked> E       remains unmatched: 'XFAIL test_xfail.py::test_function'
pytest-forked> 
pytest-forked> /build/pytest-forked-1.4.0/testing/test_xfail_behavior.py:122: Failed
pytest-forked> ----------------------------- Captured stdout call -----------------------------
pytest-forked> ============================= test session starts ==============================
pytest-forked> platform linux -- Python 3.10.9, pytest-7.2.0, pluggy-1.0.0
pytest-forked> rootdir: /build/pytest-of-nixbld/pytest-0/test_xfail0
pytest-forked> plugins: forked-1.4.0
pytest-forked> collected 1 item
pytest-forked> 
pytest-forked> test_xfail.py x                                                          [100%]
pytest-forked> 
pytest-forked> =========================== short test summary info ============================
pytest-forked> XFAIL test_xfail.py::test_function - reason: The process gets terminated; pytest-forked reason: :-1: running the test CRASHED with signal 15
pytest-forked> ============================== 1 xfailed in 0.01s ==============================
pytest-forked> _________________________ test_xfail[non-strict xfail] _________________________
pytest-forked> 
pytest-forked> is_crashing = True, is_strict = False
pytest-forked> testdir = <Testdir local('/build/pytest-of-nixbld/pytest-0/test_xfail2')>
pytest-forked> 
pytest-forked>     @pytest.mark.parametrize(
pytest-forked>         ("is_crashing", "is_strict"),
pytest-forked>         (
pytest-forked>             pytest.param(True, True, id="strict xfail"),
pytest-forked>             pytest.param(False, True, id="strict xpass"),
pytest-forked>             pytest.param(True, False, id="non-strict xfail"),
pytest-forked>             pytest.param(False, False, id="non-strict xpass"),
pytest-forked>         ),
pytest-forked>     )
pytest-forked>     def test_xfail(is_crashing, is_strict, testdir):
pytest-forked>         """Test xfail/xpass/strict permutations."""
pytest-forked>         # pylint: disable=possibly-unused-variable
pytest-forked>         sig_num = signal.SIGTERM.numerator
pytest-forked>     
pytest-forked>         test_func_body = (
pytest-forked>             "os.kill(os.getpid(), signal.SIGTERM)" if is_crashing else "assert True"
pytest-forked>         )
pytest-forked>     
pytest-forked>         if is_crashing:
pytest-forked>             # marked xfailed and crashing, no matter strict or not
pytest-forked>             expected_letter = "x"  # XFAILED
pytest-forked>             expected_lowercase = "xfailed"
pytest-forked>             expected_word = "XFAIL"
pytest-forked>         elif is_strict:
pytest-forked>             # strict and not failing as expected should cause failure
pytest-forked>             expected_letter = "F"  # FAILED
pytest-forked>             expected_lowercase = "failed"
pytest-forked>             expected_word = FAILED_WORD
pytest-forked>         elif not is_strict:
pytest-forked>             # non-strict and not failing as expected should cause xpass
pytest-forked>             expected_letter = "X"  # XPASS
pytest-forked>             expected_lowercase = "xpassed"
pytest-forked>             expected_word = "XPASS"
pytest-forked>     
pytest-forked>         session_start_title = "*==== test session starts ====*"
pytest-forked>         loaded_pytest_plugins = "plugins: forked*"
pytest-forked>         collected_tests_num = "collected 1 item"
pytest-forked>         expected_progress = "test_xfail.py {expected_letter!s}*".format(**locals())
pytest-forked>         failures_title = "*==== FAILURES ====*"
pytest-forked>         failures_test_name = "*____ test_function ____*"
pytest-forked>         failures_test_reason = "[XPASS(strict)] The process gets terminated"
pytest-forked>         short_test_summary_title = "*==== short test summary info ====*"
pytest-forked>         short_test_summary = "{expected_word!s} test_xfail.py::test_function".format(
pytest-forked>             **locals()
pytest-forked>         )
pytest-forked>         if expected_lowercase == "xpassed":
pytest-forked>             # XPASS wouldn't have the crash message from
pytest-forked>             # pytest-forked because the crash doesn't happen
pytest-forked>             short_test_summary = " ".join(
pytest-forked>                 (
pytest-forked>                     short_test_summary,
pytest-forked>                     "The process gets terminated",
pytest-forked>                 )
pytest-forked>             )
pytest-forked>         reason_string = (
pytest-forked>             "  reason: The process gets terminated; "
pytest-forked>             "pytest-forked reason: "
pytest-forked>             "*:*: running the test CRASHED with signal {sig_num:d}".format(**locals())
pytest-forked>         )
pytest-forked>         total_summary_line = "*==== 1 {expected_lowercase!s} in 0.*s* ====*".format(
pytest-forked>             **locals()
pytest-forked>         )
pytest-forked>     
pytest-forked>         expected_lines = (
pytest-forked>             session_start_title,
pytest-forked>             loaded_pytest_plugins,
pytest-forked>             collected_tests_num,
pytest-forked>             expected_progress,
pytest-forked>         )
pytest-forked>         if expected_word == FAILED_WORD:
pytest-forked>             # XPASS(strict)
pytest-forked>             expected_lines += (
pytest-forked>                 failures_title,
pytest-forked>                 failures_test_name,
pytest-forked>                 failures_test_reason,
pytest-forked>             )
pytest-forked>         expected_lines += (
pytest-forked>             short_test_summary_title,
pytest-forked>             short_test_summary,
pytest-forked>         )
pytest-forked>         if expected_lowercase == "xpassed" and expected_word == FAILED_WORD:
pytest-forked>             # XPASS(strict)
pytest-forked>             expected_lines += (reason_string,)
pytest-forked>         expected_lines += (total_summary_line,)
pytest-forked>     
pytest-forked>         test_module = testdir.makepyfile(
pytest-forked>             f"""
pytest-forked>             import os
pytest-forked>             import signal
pytest-forked>     
pytest-forked>             import pytest
pytest-forked>     
pytest-forked>             # The current implementation emits RuntimeWarning.
pytest-forked>             pytestmark = pytest.mark.filterwarnings('ignore:pytest-forked xfail')
pytest-forked>     
pytest-forked>             @pytest.mark.xfail(
pytest-forked>                 reason='The process gets terminated',
pytest-forked>                 strict={is_strict!s},
pytest-forked>             )
pytest-forked>             @pytest.mark.forked
pytest-forked>             def test_function():
pytest-forked>                 {test_func_body!s}
pytest-forked>             """
pytest-forked>         )
pytest-forked>     
pytest-forked>         pytest_run_result = testdir.runpytest(test_module, "-ra")
pytest-forked> >       pytest_run_result.stdout.fnmatch_lines(expected_lines)
pytest-forked> E       Failed: fnmatch: '*==== test session starts ====*'
pytest-forked> E          with: '============================= test session starts =============================='
pytest-forked> E       nomatch: 'plugins: forked*'
pytest-forked> E           and: 'platform linux -- Python 3.10.9, pytest-7.2.0, pluggy-1.0.0'
pytest-forked> E           and: 'rootdir: /build/pytest-of-nixbld/pytest-0/test_xfail2'
pytest-forked> E       fnmatch: 'plugins: forked*'
pytest-forked> E          with: 'plugins: forked-1.4.0'
pytest-forked> E       exact match: 'collected 1 item'
pytest-forked> E       nomatch: 'test_xfail.py x*'
pytest-forked> E           and: ''
pytest-forked> E       fnmatch: 'test_xfail.py x*'
pytest-forked> E          with: 'test_xfail.py x                                                          [100%]'
pytest-forked> E       nomatch: '*==== short test summary info ====*'
pytest-forked> E           and: ''
pytest-forked> E       fnmatch: '*==== short test summary info ====*'
pytest-forked> E          with: '=========================== short test summary info ============================'
pytest-forked> E       nomatch: 'XFAIL test_xfail.py::test_function'
pytest-forked> E           and: 'XFAIL test_xfail.py::test_function - reason: The process gets terminated; pytest-forked reason: :-1: running the test CRASHED with signal 15'
pytest-forked> E           and: '============================== 1 xfailed in 0.01s =============================='
pytest-forked> E       remains unmatched: 'XFAIL test_xfail.py::test_function'
pytest-forked> 
pytest-forked> /build/pytest-forked-1.4.0/testing/test_xfail_behavior.py:122: Failed
pytest-forked> ----------------------------- Captured stdout call -----------------------------
pytest-forked> ============================= test session starts ==============================
pytest-forked> platform linux -- Python 3.10.9, pytest-7.2.0, pluggy-1.0.0
pytest-forked> rootdir: /build/pytest-of-nixbld/pytest-0/test_xfail2
pytest-forked> plugins: forked-1.4.0
pytest-forked> collected 1 item
pytest-forked> 
pytest-forked> test_xfail.py x                                                          [100%]
pytest-forked> 
pytest-forked> =========================== short test summary info ============================
pytest-forked> XFAIL test_xfail.py::test_function - reason: The process gets terminated; pytest-forked reason: :-1: running the test CRASHED with signal 15
pytest-forked> ============================== 1 xfailed in 0.01s ==============================
pytest-forked> =========================== short test summary info ============================
pytest-forked> XFAIL testing/test_boxed.py::test_functional_boxed_capturing[sys] - capture cleanup needed
pytest-forked> XFAIL testing/test_boxed.py::test_functional_boxed_capturing[fd] - capture cleanup needed
pytest-forked> ==================== 2 failed, 6 passed, 2 xfailed in 0.41s ====================
pytest-forked> /nix/store/a45y55p9zjyxvzqyvk8l1diy0c3jrb03-stdenv-linux/setup: line 1570: pop_var_context: head of shell_variables not a function context

TypeError: __init__() takes exactly 6 arguments (3 given)

Getting this traceback... does pytest-forked need to be updated due to pytest-dev/pytest@847eace?

INTERNALERROR> Traceback (most recent call last):
INTERNALERROR>   File "/home/circleci/repo/venv/lib/python2.7/site-packages/_pytest/main.py", line 203, in wrap_session
INTERNALERROR>     session.exitstatus = doit(config, session) or 0
INTERNALERROR>   File "/home/circleci/repo/venv/lib/python2.7/site-packages/_pytest/main.py", line 243, in _main
INTERNALERROR>     config.hook.pytest_runtestloop(session=session)
INTERNALERROR>   File "/home/circleci/repo/venv/lib/python2.7/site-packages/pluggy/hooks.py", line 284, in __call__
INTERNALERROR>     return self._hookexec(self, self.get_hookimpls(), kwargs)
INTERNALERROR>   File "/home/circleci/repo/venv/lib/python2.7/site-packages/pluggy/manager.py", line 68, in _hookexec
INTERNALERROR>     return self._inner_hookexec(hook, methods, kwargs)
INTERNALERROR>   File "/home/circleci/repo/venv/lib/python2.7/site-packages/pluggy/manager.py", line 62, in <lambda>
INTERNALERROR>     firstresult=hook.spec.opts.get("firstresult") if hook.spec else False,
INTERNALERROR>   File "/home/circleci/repo/venv/lib/python2.7/site-packages/pluggy/callers.py", line 208, in _multicall
INTERNALERROR>     return outcome.get_result()
INTERNALERROR>   File "/home/circleci/repo/venv/lib/python2.7/site-packages/pluggy/callers.py", line 81, in get_result
INTERNALERROR>     _reraise(*ex)  # noqa
INTERNALERROR>   File "/home/circleci/repo/venv/lib/python2.7/site-packages/pluggy/callers.py", line 187, in _multicall
INTERNALERROR>     res = hook_impl.function(*args)
INTERNALERROR>   File "/home/circleci/repo/venv/lib/python2.7/site-packages/_pytest/main.py", line 264, in pytest_runtestloop
INTERNALERROR>     item.config.hook.pytest_runtest_protocol(item=item, nextitem=nextitem)
INTERNALERROR>   File "/home/circleci/repo/venv/lib/python2.7/site-packages/pluggy/hooks.py", line 284, in __call__
INTERNALERROR>     return self._hookexec(self, self.get_hookimpls(), kwargs)
INTERNALERROR>   File "/home/circleci/repo/venv/lib/python2.7/site-packages/pluggy/manager.py", line 68, in _hookexec
INTERNALERROR>     return self._inner_hookexec(hook, methods, kwargs)
INTERNALERROR>   File "/home/circleci/repo/venv/lib/python2.7/site-packages/pluggy/manager.py", line 62, in <lambda>
INTERNALERROR>     firstresult=hook.spec.opts.get("firstresult") if hook.spec else False,
INTERNALERROR>   File "/home/circleci/repo/venv/lib/python2.7/site-packages/pluggy/callers.py", line 208, in _multicall
INTERNALERROR>     return outcome.get_result()
INTERNALERROR>   File "/home/circleci/repo/venv/lib/python2.7/site-packages/pluggy/callers.py", line 81, in get_result
INTERNALERROR>     _reraise(*ex)  # noqa
INTERNALERROR>   File "/home/circleci/repo/venv/lib/python2.7/site-packages/pluggy/callers.py", line 187, in _multicall
INTERNALERROR>     res = hook_impl.function(*args)
INTERNALERROR>   File "/home/circleci/repo/venv/lib/python2.7/site-packages/pytest_forked/__init__.py", line 35, in pytest_runtest_protocol
INTERNALERROR>     reports = forked_run_report(item)
INTERNALERROR>   File "/home/circleci/repo/venv/lib/python2.7/site-packages/pytest_forked/__init__.py", line 63, in forked_run_report
INTERNALERROR>     return [report_process_crash(item, result)]
INTERNALERROR>   File "/home/circleci/repo/venv/lib/python2.7/site-packages/pytest_forked/__init__.py", line 71, in report_process_crash
INTERNALERROR>     call = runner.CallInfo(lambda: 0/0, "???")
INTERNALERROR> TypeError: __init__() takes exactly 6 arguments (3 given)

pytest-xdist-1.0.2 installation fails with ImportError: No module named 'setuptools'

Since a recent pytest update, pip3 install pytest pytest-xdist fails in an Ubuntu 16.04 image on Travis (macOS and Windows images work fine).

Example log:

$ pip3 install pytest pytest-xdist
Collecting pytest
  Downloading https://files.pythonhosted.org/packages/11/e9/dc9a7269a7e1fed46de7d5864da6a86370256c791bf263dd0c7d7e8f1ff1/pytest-4.2.1-py2.py3-none-any.whl (218kB)
    100% |████████████████████████████████| 225kB 4.0MB/s 
Collecting pytest-xdist
  Downloading https://files.pythonhosted.org/packages/fb/24/b71217c2083beca1f3b1fb8b32030c68ef189a718034bafe6b5895daf6a8/pytest_xdist-1.26.1-py2.py3-none-any.whl
Collecting six>=1.10.0 (from pytest)
  Downloading https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting atomicwrites>=1.0 (from pytest)
  Downloading https://files.pythonhosted.org/packages/52/90/6155aa926f43f2b2a22b01be7241be3bfd1ceaf7d0b3267213e8127d41f4/atomicwrites-1.3.0-py2.py3-none-any.whl
Collecting setuptools (from pytest)
  Downloading https://files.pythonhosted.org/packages/d1/6a/4b2fcefd2ea0868810e92d519dacac1ddc64a2e53ba9e3422c3b62b378a6/setuptools-40.8.0-py2.py3-none-any.whl (575kB)
    100% |████████████████████████████████| 583kB 2.3MB/s 
Collecting more-itertools>=4.0.0; python_version > "2.7" (from pytest)
  Downloading https://files.pythonhosted.org/packages/ae/d4/d6bad4844831943dd667510947712750004525c5807711982f4ec390da2b/more_itertools-6.0.0-py3-none-any.whl (52kB)
    100% |████████████████████████████████| 61kB 9.0MB/s 
Collecting pluggy>=0.7 (from pytest)
  Downloading https://files.pythonhosted.org/packages/2d/60/f58d9e8197f911f9405bf7e02227b43a2acc2c2f1a8cbb1be5ecf6bfd0b8/pluggy-0.8.1-py2.py3-none-any.whl
Collecting py>=1.5.0 (from pytest)
  Downloading https://files.pythonhosted.org/packages/3e/c7/3da685ef117d42ac8d71af525208759742dd235f8094221fdaafcd3dba8f/py-1.7.0-py2.py3-none-any.whl (83kB)
    100% |████████████████████████████████| 92kB 8.8MB/s 
Collecting pathlib2>=2.2.0; python_version < "3.6" (from pytest)
  Downloading https://files.pythonhosted.org/packages/2a/46/c696dcf1c7aad917b39b875acdc5451975e3a9b4890dca8329983201c97a/pathlib2-2.3.3-py2.py3-none-any.whl
Collecting attrs>=17.4.0 (from pytest)
  Downloading https://files.pythonhosted.org/packages/3a/e1/5f9023cc983f1a628a8c2fd051ad19e76ff7b142a0faf329336f9a62a514/attrs-18.2.0-py2.py3-none-any.whl
Collecting pytest-forked (from pytest-xdist)
  Downloading https://files.pythonhosted.org/packages/30/be/cb5dc4f0fa5ba121943305f4f235dc1a30fae53daac20094ab89f4618578/pytest-forked-1.0.2.tar.gz
    Complete output from command python setup.py egg_info:
    Traceback (most recent call last):
      File "<string>", line 1, in <module>
    ImportError: No module named 'setuptools'
    
    ----------------------------------------
Command "python setup.py egg_info" failed with error code 1 in /tmp/pip-build-pov9bhcq/pytest-forked/
You are using pip version 8.1.1, however version 19.0.2 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.

Indeed, installing python3-pip via apt did recommend python3-setuptools, but it did not actually install it.

The output for the last working version is:

$ pip3 install pytest pytest-xdist
Collecting pytest
  Downloading https://files.pythonhosted.org/packages/11/e9/dc9a7269a7e1fed46de7d5864da6a86370256c791bf263dd0c7d7e8f1ff1/pytest-4.2.1-py2.py3-none-any.whl (218kB)
    100% |████████████████████████████████| 225kB 4.0MB/s 
Collecting pytest-xdist
  Downloading https://files.pythonhosted.org/packages/fb/24/b71217c2083beca1f3b1fb8b32030c68ef189a718034bafe6b5895daf6a8/pytest_xdist-1.26.1-py2.py3-none-any.whl
Collecting more-itertools>=4.0.0; python_version > "2.7" (from pytest)
  Downloading https://files.pythonhosted.org/packages/ae/d4/d6bad4844831943dd667510947712750004525c5807711982f4ec390da2b/more_itertools-6.0.0-py3-none-any.whl (52kB)
    100% |████████████████████████████████| 61kB 10.6MB/s 
Collecting six>=1.10.0 (from pytest)
  Downloading https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting pluggy>=0.7 (from pytest)
  Downloading https://files.pythonhosted.org/packages/2d/60/f58d9e8197f911f9405bf7e02227b43a2acc2c2f1a8cbb1be5ecf6bfd0b8/pluggy-0.8.1-py2.py3-none-any.whl
Collecting attrs>=17.4.0 (from pytest)
  Downloading https://files.pythonhosted.org/packages/3a/e1/5f9023cc983f1a628a8c2fd051ad19e76ff7b142a0faf329336f9a62a514/attrs-18.2.0-py2.py3-none-any.whl
Collecting setuptools (from pytest)
  Downloading https://files.pythonhosted.org/packages/d1/6a/4b2fcefd2ea0868810e92d519dacac1ddc64a2e53ba9e3422c3b62b378a6/setuptools-40.8.0-py2.py3-none-any.whl (575kB)
    100% |████████████████████████████████| 583kB 2.4MB/s 
Collecting atomicwrites>=1.0 (from pytest)
  Downloading https://files.pythonhosted.org/packages/52/90/6155aa926f43f2b2a22b01be7241be3bfd1ceaf7d0b3267213e8127d41f4/atomicwrites-1.3.0-py2.py3-none-any.whl
Collecting pathlib2>=2.2.0; python_version < "3.6" (from pytest)
  Downloading https://files.pythonhosted.org/packages/2a/46/c696dcf1c7aad917b39b875acdc5451975e3a9b4890dca8329983201c97a/pathlib2-2.3.3-py2.py3-none-any.whl
Collecting py>=1.5.0 (from pytest)
  Downloading https://files.pythonhosted.org/packages/3e/c7/3da685ef117d42ac8d71af525208759742dd235f8094221fdaafcd3dba8f/py-1.7.0-py2.py3-none-any.whl (83kB)
    100% |████████████████████████████████| 92kB 9.2MB/s 
Collecting pytest-forked (from pytest-xdist)
  Downloading https://files.pythonhosted.org/packages/b0/2a/99ec2a8a26c0a911f76ec2a9461ff0e0d97b5b7bd3282de57eca947f39fc/pytest_forked-1.0.1-py2.py3-none-any.whl
Collecting execnet>=1.1 (from pytest-xdist)
  Downloading https://files.pythonhosted.org/packages/f9/76/3343e69a2a1602052f587898934e5fea395d22310d39c07955596597227c/execnet-1.5.0-py2.py3-none-any.whl
Collecting apipkg>=1.4 (from execnet>=1.1->pytest-xdist)
  Downloading https://files.pythonhosted.org/packages/67/08/4815a09603fc800209431bec5b8bd2acf2f95abdfb558a44a42507fb94da/apipkg-1.5-py2.py3-none-any.whl
Installing collected packages: more-itertools, six, pluggy, attrs, setuptools, atomicwrites, pathlib2, py, pytest, pytest-forked, apipkg, execnet, pytest-xdist
Successfully installed apipkg atomicwrites attrs execnet more-itertools pathlib2 pluggy py pytest pytest-forked pytest-xdist setuptools six-1.10.0
You are using pip version 8.1.1, however version 19.0.2 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.

Silent failures when a worker dies (e.g. segfault) and using forked and xdist

The following test case fails when both --forked and xdist (-n 2) are used. It passes if only --forked is used.

import pytest


def test_segfault_reported_as_failure(testdir):
    p1 = testdir.makepyfile("""
    import ctypes

    def test_with_segfault():
        ctypes.string_at(0)

    def test_that_passes():
        assert True
    """)
    result = testdir.runpytest(p1, '--forked', '-n', 2)  # this doesn't fail if we remove '-n', 2
    result.stdout.fnmatch_lines([
        "*1 failed*"
    ])

with the following output:

======================================== test session starts =========================================
platform linux -- Python 3.7.5, pytest-5.2.1, py-1.8.0, pluggy-0.13.0
rootdir: /home
plugins: xdist-1.30.1.dev1+g79dd52b, forked-1.1.3
collected 1 item

home/test_segfault.py Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/_pytest/pytester.py", line 863, in runpytest_inprocess
    reprec = self.inline_run(*args, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/_pytest/pytester.py", line 829, in inline_run
    ret = pytest.main(list(args), plugins=plugins)
  File "/usr/local/lib/python3.7/site-packages/_pytest/config/__init__.py", line 71, in main
    config = _prepareconfig(args, plugins)
  File "/usr/local/lib/python3.7/site-packages/_pytest/config/__init__.py", line 221, in _prepareconfig
    pluginmanager=pluginmanager, args=args
  File "/usr/local/lib/python3.7/site-packages/pluggy/hooks.py", line 286, in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
  File "/usr/local/lib/python3.7/site-packages/pluggy/manager.py", line 92, in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
  File "/usr/local/lib/python3.7/site-packages/pluggy/manager.py", line 86, in <lambda>
    firstresult=hook.spec.opts.get("firstresult") if hook.spec else False,
  File "/usr/local/lib/python3.7/site-packages/pluggy/callers.py", line 203, in _multicall
    gen.send(outcome)
  File "/usr/local/lib/python3.7/site-packages/_pytest/helpconfig.py", line 89, in pytest_cmdline_parse
    config = outcome.get_result()
  File "/usr/local/lib/python3.7/site-packages/pluggy/callers.py", line 80, in get_result
    raise ex[1].with_traceback(ex[2])
  File "/usr/local/lib/python3.7/site-packages/pluggy/callers.py", line 187, in _multicall
    res = hook_impl.function(*args)
  File "/usr/local/lib/python3.7/site-packages/_pytest/config/__init__.py", line 736, in pytest_cmdline_parse
    self.parse(args)
  File "/usr/local/lib/python3.7/site-packages/_pytest/config/__init__.py", line 943, in parse
    self._preparse(args, addopts=addopts)
  File "/usr/local/lib/python3.7/site-packages/_pytest/config/__init__.py", line 878, in _preparse
    self._initini(args)
  File "/usr/local/lib/python3.7/site-packages/_pytest/config/__init__.py", line 803, in _initini
    args, namespace=copy.copy(self.option)
  File "/usr/local/lib/python3.7/site-packages/_pytest/config/argparsing.py", line 128, in parse_known_and_unknown_args
    return optparser.parse_known_args(args, namespace=namespace)
  File "/usr/local/lib/python3.7/argparse.py", line 1787, in parse_known_args
    namespace, args = self._parse_known_args(args, namespace)
  File "/usr/local/lib/python3.7/argparse.py", line 1828, in _parse_known_args
    option_tuple = self._parse_optional(arg_string)
  File "/usr/local/lib/python3.7/site-packages/_pytest/config/argparsing.py", line 377, in _parse_optional
    if not arg_string[0] in self.prefix_chars:
TypeError: 'int' object is not subscriptable
F

My execution environment is in the following docker container:

FROM python:3.7-alpine

RUN apk add --no-cache --virtual .build-deps git

RUN pip3 install git+https://github.com/pytest-dev/pytest-forked

RUN pip3 install git+https://github.com/pytest-dev/pytest-xdist

PyPI release?

There isn't any PyPI release for this plugin yet?

I am currently writing a custom importer, and it would be useful to be able to fork a different process for each test to make sure tests do not conflict via sys.modules...

test suite fails when `hypothesis` is in the environment (a test dependency of pytest)

Hi,

I'm trying to update this package to 1.3.0 on GNU Guix, but I'm encountering the following test failures:

============================= test session starts ==============================
platform linux -- Python 3.8.2, pytest-6.2.4, py-1.10.0, pluggy-0.13.1 -- /gnu/store/f8s95qc6dfhl0r45m70hczw5zip0xjxq-python-wrapper-3.8.2/bin/python
cachedir: .pytest_cache
hypothesis profile 'default' -> database=DirectoryBasedExampleDatabase('/tmp/guix-build-python-pytest-forked-1.3.0.drv-0/source/.hypothesis/examples')
rootdir: /tmp/guix-build-python-pytest-forked-1.3.0.drv-0/source, configfile: tox.ini
plugins: hypothesis-5.4.1, forked-1.1.3
collecting ... collected 10 items

testing/test_boxed.py::test_functional_boxed PASSED                      [ 10%]
testing/test_boxed.py::test_functional_boxed_per_test PASSED             [ 20%]
testing/test_boxed.py::test_functional_boxed_capturing[no] PASSED        [ 30%]
testing/test_boxed.py::test_functional_boxed_capturing[sys] XFAIL (c...) [ 40%]
testing/test_boxed.py::test_functional_boxed_capturing[fd] XFAIL (ca...) [ 50%]
testing/test_boxed.py::test_is_not_boxed_by_default PASSED               [ 60%]
testing/test_xfail_behavior.py::test_xfail[strict xfail] FAILED          [ 70%]
testing/test_xfail_behavior.py::test_xfail[strict xpass] FAILED          [ 80%]
testing/test_xfail_behavior.py::test_xfail[non-strict xfail] FAILED      [ 90%]
testing/test_xfail_behavior.py::test_xfail[non-strict xpass] FAILED      [100%]

=================================== FAILURES ===================================
___________________________ test_xfail[strict xfail] ___________________________

is_crashing = True, is_strict = True
testdir = <Testdir local('/tmp/guix-build-python-pytest-forked-1.3.0.drv-0/pytest-of-nixbld/pytest-0/test_xfail0')>

    @pytest.mark.parametrize(
        ('is_crashing', 'is_strict'),
        (
            pytest.param(True, True, id='strict xfail'),
            pytest.param(False, True, id='strict xpass'),
            pytest.param(True, False, id='non-strict xfail'),
            pytest.param(False, False, id='non-strict xpass'),
        ),
    )
    def test_xfail(is_crashing, is_strict, testdir):
        """Test xfail/xpass/strict permutations."""
        # pylint: disable=possibly-unused-variable
        sig_num = signal.SIGTERM.numerator
    
        test_func_body = (
            'os.kill(os.getpid(), signal.SIGTERM)'
            if is_crashing
            else 'assert True'
        )
    
        if is_crashing:
            # marked xfailed and crashing, no matter strict or not
            expected_letter = 'x'  # XFAILED
            expected_lowercase = 'xfailed'
            expected_word = 'XFAIL'
        elif is_strict:
            # strict and not failing as expected should cause failure
            expected_letter = 'F'  # FAILED
            expected_lowercase = 'failed'
            expected_word = FAILED_WORD
        elif not is_strict:
            # non-strict and not failing as expected should cause xpass
            expected_letter = 'X'  # XPASS
            expected_lowercase = 'xpassed'
            expected_word = 'XPASS'
    
        session_start_title = '*==== test session starts ====*'
        loaded_pytest_plugins = 'plugins: forked*'
        collected_tests_num = 'collected 1 item'
        expected_progress = 'test_xfail.py {expected_letter!s}*'.format(**locals())
        failures_title = '*==== FAILURES ====*'
        failures_test_name = '*____ test_function ____*'
        failures_test_reason = '[XPASS(strict)] The process gets terminated'
        short_test_summary_title = '*==== short test summary info ====*'
        short_test_summary = (
            '{expected_word!s} test_xfail.py::test_function'.
            format(**locals())
        )
        if expected_lowercase == 'xpassed':
            # XPASS wouldn't have the crash message from
            # pytest-forked because the crash doesn't happen
            short_test_summary = ' '.join((
                short_test_summary, 'The process gets terminated',
            ))
        reason_string = (
            '  reason: The process gets terminated; '
            'pytest-forked reason: '
            '*:*: running the test CRASHED with signal {sig_num:d}'.
            format(**locals())
        )
        total_summary_line = (
            '*==== 1 {expected_lowercase!s} in 0.*s* ====*'.
            format(**locals())
        )
    
        expected_lines = (
            session_start_title,
            loaded_pytest_plugins,
            collected_tests_num,
            expected_progress,
        )
        if expected_word == FAILED_WORD:
            # XPASS(strict)
            expected_lines += (
                failures_title,
                failures_test_name,
                failures_test_reason,
            )
        expected_lines += (
            short_test_summary_title,
            short_test_summary,
        )
        if expected_lowercase == 'xpassed' and expected_word == FAILED_WORD:
            # XPASS(strict)
            expected_lines += (
                reason_string,
            )
        expected_lines += (
            total_summary_line,
        )
    
        test_module = testdir.makepyfile(
            """
            import os
            import signal
    
            import pytest
    
            # The current implementation emits RuntimeWarning.
            pytestmark = pytest.mark.filterwarnings('ignore:pytest-forked xfail')
    
            @pytest.mark.xfail(
                reason='The process gets terminated',
                strict={is_strict!s},
            )
            @pytest.mark.forked
            def test_function():
                {test_func_body!s}
            """.
            format(**locals())
        )
    
        pytest_run_result = testdir.runpytest(test_module, '-ra')
>       pytest_run_result.stdout.fnmatch_lines(expected_lines)
E       Failed: fnmatch: '*==== test session starts ====*'
E          with: '============================= test session starts =============================='
E       nomatch: 'plugins: forked*'
E           and: 'platform linux -- Python 3.8.2, pytest-6.2.4, py-1.10.0, pluggy-0.13.1'
E           and: 'rootdir: /tmp/guix-build-python-pytest-forked-1.3.0.drv-0/pytest-of-nixbld/pytest-0/test_xfail0'
E           and: 'plugins: hypothesis-5.4.1, forked-1.1.3'
E           and: 'collected 1 item'
E           and: ''
E           and: 'test_xfail.py x                                                          [100%]'
E           and: ''
E           and: '=========================== short test summary info ============================'
E           and: 'XFAIL test_xfail.py::test_function'
E           and: '  reason: The process gets terminated; pytest-forked reason: :-1: running the test CRASHED with signal 15'
E           and: '============================== 1 xfailed in 0.03s =============================='
E       remains unmatched: 'plugins: forked*'

/tmp/guix-build-python-pytest-forked-1.3.0.drv-0/source/testing/test_xfail_behavior.py:130: Failed
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.2, pytest-6.2.4, py-1.10.0, pluggy-0.13.1
rootdir: /tmp/guix-build-python-pytest-forked-1.3.0.drv-0/pytest-of-nixbld/pytest-0/test_xfail0
plugins: hypothesis-5.4.1, forked-1.1.3
collected 1 item

test_xfail.py x                                                          [100%]

=========================== short test summary info ============================
XFAIL test_xfail.py::test_function
  reason: The process gets terminated; pytest-forked reason: :-1: running the test CRASHED with signal 15
============================== 1 xfailed in 0.03s ==============================
___________________________ test_xfail[strict xpass] ___________________________

is_crashing = False, is_strict = True
testdir = <Testdir local('/tmp/guix-build-python-pytest-forked-1.3.0.drv-0/pytest-of-nixbld/pytest-0/test_xfail1')>

    @pytest.mark.parametrize(
        ('is_crashing', 'is_strict'),
        (
            pytest.param(True, True, id='strict xfail'),
            pytest.param(False, True, id='strict xpass'),
            pytest.param(True, False, id='non-strict xfail'),
            pytest.param(False, False, id='non-strict xpass'),
        ),
    )
    def test_xfail(is_crashing, is_strict, testdir):
        """Test xfail/xpass/strict permutations."""
        # pylint: disable=possibly-unused-variable
        sig_num = signal.SIGTERM.numerator
    
        test_func_body = (
            'os.kill(os.getpid(), signal.SIGTERM)'
            if is_crashing
            else 'assert True'
        )
    
        if is_crashing:
            # marked xfailed and crashing, no matter strict or not
            expected_letter = 'x'  # XFAILED
            expected_lowercase = 'xfailed'
            expected_word = 'XFAIL'
        elif is_strict:
            # strict and not failing as expected should cause failure
            expected_letter = 'F'  # FAILED
            expected_lowercase = 'failed'
            expected_word = FAILED_WORD
        elif not is_strict:
            # non-strict and not failing as expected should cause xpass
            expected_letter = 'X'  # XPASS
            expected_lowercase = 'xpassed'
            expected_word = 'XPASS'
    
        session_start_title = '*==== test session starts ====*'
        loaded_pytest_plugins = 'plugins: forked*'
        collected_tests_num = 'collected 1 item'
        expected_progress = 'test_xfail.py {expected_letter!s}*'.format(**locals())
        failures_title = '*==== FAILURES ====*'
        failures_test_name = '*____ test_function ____*'
        failures_test_reason = '[XPASS(strict)] The process gets terminated'
        short_test_summary_title = '*==== short test summary info ====*'
        short_test_summary = (
            '{expected_word!s} test_xfail.py::test_function'.
            format(**locals())
        )
        if expected_lowercase == 'xpassed':
            # XPASS wouldn't have the crash message from
            # pytest-forked because the crash doesn't happen
            short_test_summary = ' '.join((
                short_test_summary, 'The process gets terminated',
            ))
        reason_string = (
            '  reason: The process gets terminated; '
            'pytest-forked reason: '
            '*:*: running the test CRASHED with signal {sig_num:d}'.
            format(**locals())
        )
        total_summary_line = (
            '*==== 1 {expected_lowercase!s} in 0.*s* ====*'.
            format(**locals())
        )
    
        expected_lines = (
            session_start_title,
            loaded_pytest_plugins,
            collected_tests_num,
            expected_progress,
        )
        if expected_word == FAILED_WORD:
            # XPASS(strict)
            expected_lines += (
                failures_title,
                failures_test_name,
                failures_test_reason,
            )
        expected_lines += (
            short_test_summary_title,
            short_test_summary,
        )
        if expected_lowercase == 'xpassed' and expected_word == FAILED_WORD:
            # XPASS(strict)
            expected_lines += (
                reason_string,
            )
        expected_lines += (
            total_summary_line,
        )
    
        test_module = testdir.makepyfile(
            """
            import os
            import signal
    
            import pytest
    
            # The current implementation emits RuntimeWarning.
            pytestmark = pytest.mark.filterwarnings('ignore:pytest-forked xfail')
    
            @pytest.mark.xfail(
                reason='The process gets terminated',
                strict={is_strict!s},
            )
            @pytest.mark.forked
            def test_function():
                {test_func_body!s}
            """.
            format(**locals())
        )
    
        pytest_run_result = testdir.runpytest(test_module, '-ra')
>       pytest_run_result.stdout.fnmatch_lines(expected_lines)
E       Failed: fnmatch: '*==== test session starts ====*'
E          with: '============================= test session starts =============================='
E       nomatch: 'plugins: forked*'
E           and: 'platform linux -- Python 3.8.2, pytest-6.2.4, py-1.10.0, pluggy-0.13.1'
E           and: 'rootdir: /tmp/guix-build-python-pytest-forked-1.3.0.drv-0/pytest-of-nixbld/pytest-0/test_xfail1'
E           and: 'plugins: hypothesis-5.4.1, forked-1.1.3'
E           and: 'collected 1 item'
E           and: ''
E           and: 'test_xfail.py F                                                          [100%]'
E           and: ''
E           and: '=================================== FAILURES ==================================='
E           and: '________________________________ test_function _________________________________'
E           and: '[XPASS(strict)] The process gets terminated'
E           and: '=========================== short test summary info ============================'
E           and: 'FAILED test_xfail.py::test_function'
E           and: '============================== 1 failed in 0.03s ==============================='
E       remains unmatched: 'plugins: forked*'

/tmp/guix-build-python-pytest-forked-1.3.0.drv-0/source/testing/test_xfail_behavior.py:130: Failed
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.2, pytest-6.2.4, py-1.10.0, pluggy-0.13.1
rootdir: /tmp/guix-build-python-pytest-forked-1.3.0.drv-0/pytest-of-nixbld/pytest-0/test_xfail1
plugins: hypothesis-5.4.1, forked-1.1.3
collected 1 item

test_xfail.py F                                                          [100%]

=================================== FAILURES ===================================
________________________________ test_function _________________________________
[XPASS(strict)] The process gets terminated
=========================== short test summary info ============================
FAILED test_xfail.py::test_function
============================== 1 failed in 0.03s ===============================
_________________________ test_xfail[non-strict xfail] _________________________

is_crashing = True, is_strict = False
testdir = <Testdir local('/tmp/guix-build-python-pytest-forked-1.3.0.drv-0/pytest-of-nixbld/pytest-0/test_xfail2')>

    @pytest.mark.parametrize(
        ('is_crashing', 'is_strict'),
        (
            pytest.param(True, True, id='strict xfail'),
            pytest.param(False, True, id='strict xpass'),
            pytest.param(True, False, id='non-strict xfail'),
            pytest.param(False, False, id='non-strict xpass'),
        ),
    )
    def test_xfail(is_crashing, is_strict, testdir):
        """Test xfail/xpass/strict permutations."""
        # pylint: disable=possibly-unused-variable
        sig_num = signal.SIGTERM.numerator
    
        test_func_body = (
            'os.kill(os.getpid(), signal.SIGTERM)'
            if is_crashing
            else 'assert True'
        )
    
        if is_crashing:
            # marked xfailed and crashing, no matter strict or not
            expected_letter = 'x'  # XFAILED
            expected_lowercase = 'xfailed'
            expected_word = 'XFAIL'
        elif is_strict:
            # strict and not failing as expected should cause failure
            expected_letter = 'F'  # FAILED
            expected_lowercase = 'failed'
            expected_word = FAILED_WORD
        elif not is_strict:
            # non-strict and not failing as expected should cause xpass
            expected_letter = 'X'  # XPASS
            expected_lowercase = 'xpassed'
            expected_word = 'XPASS'
    
        session_start_title = '*==== test session starts ====*'
        loaded_pytest_plugins = 'plugins: forked*'
        collected_tests_num = 'collected 1 item'
        expected_progress = 'test_xfail.py {expected_letter!s}*'.format(**locals())
        failures_title = '*==== FAILURES ====*'
        failures_test_name = '*____ test_function ____*'
        failures_test_reason = '[XPASS(strict)] The process gets terminated'
        short_test_summary_title = '*==== short test summary info ====*'
        short_test_summary = (
            '{expected_word!s} test_xfail.py::test_function'.
            format(**locals())
        )
        if expected_lowercase == 'xpassed':
            # XPASS wouldn't have the crash message from
            # pytest-forked because the crash doesn't happen
            short_test_summary = ' '.join((
                short_test_summary, 'The process gets terminated',
            ))
        reason_string = (
            '  reason: The process gets terminated; '
            'pytest-forked reason: '
            '*:*: running the test CRASHED with signal {sig_num:d}'.
            format(**locals())
        )
        total_summary_line = (
            '*==== 1 {expected_lowercase!s} in 0.*s* ====*'.
            format(**locals())
        )
    
        expected_lines = (
            session_start_title,
            loaded_pytest_plugins,
            collected_tests_num,
            expected_progress,
        )
        if expected_word == FAILED_WORD:
            # XPASS(strict)
            expected_lines += (
                failures_title,
                failures_test_name,
                failures_test_reason,
            )
        expected_lines += (
            short_test_summary_title,
            short_test_summary,
        )
        if expected_lowercase == 'xpassed' and expected_word == FAILED_WORD:
            # XPASS(strict)
            expected_lines += (
                reason_string,
            )
        expected_lines += (
            total_summary_line,
        )
    
        test_module = testdir.makepyfile(
            """
            import os
            import signal
    
            import pytest
    
            # The current implementation emits RuntimeWarning.
            pytestmark = pytest.mark.filterwarnings('ignore:pytest-forked xfail')
    
            @pytest.mark.xfail(
                reason='The process gets terminated',
                strict={is_strict!s},
            )
            @pytest.mark.forked
            def test_function():
                {test_func_body!s}
            """.
            format(**locals())
        )
    
        pytest_run_result = testdir.runpytest(test_module, '-ra')
>       pytest_run_result.stdout.fnmatch_lines(expected_lines)
E       Failed: fnmatch: '*==== test session starts ====*'
E          with: '============================= test session starts =============================='
E       nomatch: 'plugins: forked*'
E           and: 'platform linux -- Python 3.8.2, pytest-6.2.4, py-1.10.0, pluggy-0.13.1'
E           and: 'rootdir: /tmp/guix-build-python-pytest-forked-1.3.0.drv-0/pytest-of-nixbld/pytest-0/test_xfail2'
E           and: 'plugins: hypothesis-5.4.1, forked-1.1.3'
E           and: 'collected 1 item'
E           and: ''
E           and: 'test_xfail.py x                                                          [100%]'
E           and: ''
E           and: '=========================== short test summary info ============================'
E           and: 'XFAIL test_xfail.py::test_function'
E           and: '  reason: The process gets terminated; pytest-forked reason: :-1: running the test CRASHED with signal 15'
E           and: '============================== 1 xfailed in 0.03s =============================='
E       remains unmatched: 'plugins: forked*'

/tmp/guix-build-python-pytest-forked-1.3.0.drv-0/source/testing/test_xfail_behavior.py:130: Failed
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.2, pytest-6.2.4, py-1.10.0, pluggy-0.13.1
rootdir: /tmp/guix-build-python-pytest-forked-1.3.0.drv-0/pytest-of-nixbld/pytest-0/test_xfail2
plugins: hypothesis-5.4.1, forked-1.1.3
collected 1 item

test_xfail.py x                                                          [100%]

=========================== short test summary info ============================
XFAIL test_xfail.py::test_function
  reason: The process gets terminated; pytest-forked reason: :-1: running the test CRASHED with signal 15
============================== 1 xfailed in 0.03s ==============================
_________________________ test_xfail[non-strict xpass] _________________________

is_crashing = False, is_strict = False
testdir = <Testdir local('/tmp/guix-build-python-pytest-forked-1.3.0.drv-0/pytest-of-nixbld/pytest-0/test_xfail3')>

    @pytest.mark.parametrize(
        ('is_crashing', 'is_strict'),
        (
            pytest.param(True, True, id='strict xfail'),
            pytest.param(False, True, id='strict xpass'),
            pytest.param(True, False, id='non-strict xfail'),
            pytest.param(False, False, id='non-strict xpass'),
        ),
    )
    def test_xfail(is_crashing, is_strict, testdir):
        """Test xfail/xpass/strict permutations."""
        # pylint: disable=possibly-unused-variable
        sig_num = signal.SIGTERM.numerator
    
        test_func_body = (
            'os.kill(os.getpid(), signal.SIGTERM)'
            if is_crashing
            else 'assert True'
        )
    
        if is_crashing:
            # marked xfailed and crashing, no matter strict or not
            expected_letter = 'x'  # XFAILED
            expected_lowercase = 'xfailed'
            expected_word = 'XFAIL'
        elif is_strict:
            # strict and not failing as expected should cause failure
            expected_letter = 'F'  # FAILED
            expected_lowercase = 'failed'
            expected_word = FAILED_WORD
        elif not is_strict:
            # non-strict and not failing as expected should cause xpass
            expected_letter = 'X'  # XPASS
            expected_lowercase = 'xpassed'
            expected_word = 'XPASS'
    
        session_start_title = '*==== test session starts ====*'
        loaded_pytest_plugins = 'plugins: forked*'
        collected_tests_num = 'collected 1 item'
        expected_progress = 'test_xfail.py {expected_letter!s}*'.format(**locals())
        failures_title = '*==== FAILURES ====*'
        failures_test_name = '*____ test_function ____*'
        failures_test_reason = '[XPASS(strict)] The process gets terminated'
        short_test_summary_title = '*==== short test summary info ====*'
        short_test_summary = (
            '{expected_word!s} test_xfail.py::test_function'.
            format(**locals())
        )
        if expected_lowercase == 'xpassed':
            # XPASS wouldn't have the crash message from
            # pytest-forked because the crash doesn't happen
            short_test_summary = ' '.join((
                short_test_summary, 'The process gets terminated',
            ))
        reason_string = (
            '  reason: The process gets terminated; '
            'pytest-forked reason: '
            '*:*: running the test CRASHED with signal {sig_num:d}'.
            format(**locals())
        )
        total_summary_line = (
            '*==== 1 {expected_lowercase!s} in 0.*s* ====*'.
            format(**locals())
        )
    
        expected_lines = (
            session_start_title,
            loaded_pytest_plugins,
            collected_tests_num,
            expected_progress,
        )
        if expected_word == FAILED_WORD:
            # XPASS(strict)
            expected_lines += (
                failures_title,
                failures_test_name,
                failures_test_reason,
            )
        expected_lines += (
            short_test_summary_title,
            short_test_summary,
        )
        if expected_lowercase == 'xpassed' and expected_word == FAILED_WORD:
            # XPASS(strict)
            expected_lines += (
                reason_string,
            )
        expected_lines += (
            total_summary_line,
        )
    
        test_module = testdir.makepyfile(
            """
            import os
            import signal
    
            import pytest
    
            # The current implementation emits RuntimeWarning.
            pytestmark = pytest.mark.filterwarnings('ignore:pytest-forked xfail')
    
            @pytest.mark.xfail(
                reason='The process gets terminated',
                strict={is_strict!s},
            )
            @pytest.mark.forked
            def test_function():
                {test_func_body!s}
            """.
            format(**locals())
        )
    
        pytest_run_result = testdir.runpytest(test_module, '-ra')
>       pytest_run_result.stdout.fnmatch_lines(expected_lines)
E       Failed: fnmatch: '*==== test session starts ====*'
E          with: '============================= test session starts =============================='
E       nomatch: 'plugins: forked*'
E           and: 'platform linux -- Python 3.8.2, pytest-6.2.4, py-1.10.0, pluggy-0.13.1'
E           and: 'rootdir: /tmp/guix-build-python-pytest-forked-1.3.0.drv-0/pytest-of-nixbld/pytest-0/test_xfail3'
E           and: 'plugins: hypothesis-5.4.1, forked-1.1.3'
E           and: 'collected 1 item'
E           and: ''
E           and: 'test_xfail.py X                                                          [100%]'
E           and: ''
E           and: '=========================== short test summary info ============================'
E           and: 'XPASS test_xfail.py::test_function The process gets terminated'
E           and: '============================== 1 xpassed in 0.04s =============================='
E       remains unmatched: 'plugins: forked*'

/tmp/guix-build-python-pytest-forked-1.3.0.drv-0/source/testing/test_xfail_behavior.py:130: Failed
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.2, pytest-6.2.4, py-1.10.0, pluggy-0.13.1
rootdir: /tmp/guix-build-python-pytest-forked-1.3.0.drv-0/pytest-of-nixbld/pytest-0/test_xfail3
plugins: hypothesis-5.4.1, forked-1.1.3
collected 1 item

test_xfail.py X                                                          [100%]

=========================== short test summary info ============================
XPASS test_xfail.py::test_function The process gets terminated
============================== 1 xpassed in 0.04s ==============================
=========================== short test summary info ============================
FAILED testing/test_xfail_behavior.py::test_xfail[strict xfail] - Failed: fnm...
FAILED testing/test_xfail_behavior.py::test_xfail[strict xpass] - Failed: fnm...
FAILED testing/test_xfail_behavior.py::test_xfail[non-strict xfail] - Failed:...
FAILED testing/test_xfail_behavior.py::test_xfail[non-strict xpass] - Failed:...
XFAIL testing/test_boxed.py::test_functional_boxed_capturing[sys]
  capture cleanup needed
XFAIL testing/test_boxed.py::test_functional_boxed_capturing[fd]
  capture cleanup needed
==================== 4 failed, 4 passed, 2 xfailed in 1.62s ====================

The only direct dependency is pytest 6.2.4, but if you'd like to see the whole list of transitive dependencies used, here it is:

$ ./pre-inst-env guix refresh --list-transitive python-pytest-forked
[email protected] depends on the following 91 packages: [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] tzdata@2019c [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] [email protected] ld-wrapper@0 [email protected] [email protected] [email protected] [email protected]

Allow marked tests to be skipped under Windows

Currently, if a test is marked with forked, this plugin tries to run it in a forked subprocess regardless of the current environment. Adding a conditional skip marker does not help either, which manifests itself as pytest-dev/pytest#7327.

The following snippet still runs on Windows:

@pytest.mark.skipif(IS_WINDOWS, ...)
@pytest.mark.forked
def test():
    ...

The workaround that I currently use is:

@pytest.mark.skipif(IS_WINDOWS, ...)
def test():
    ...

if not IS_WINDOWS:
    test = pytest.mark.forked(test)

This could be addressed in a few ways:

  1. Actually respect skip markers.
  2. Skip any tests marked with forked when the env is not supported.
  3. Run these tests but w/o trying to sandbox them.

I think I like 1+2 most.
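
For reference, a rough conftest.py-style sketch of what option 2 could look like (this is not the plugin's actual code, just an illustration that could also serve as a local workaround in the meantime):

import os

import pytest


def pytest_collection_modifyitems(config, items):
    # Illustration only: when the platform cannot fork (e.g. Windows),
    # turn every test carrying the 'forked' marker into a skip instead
    # of letting the plugin attempt to fork.
    if hasattr(os, "fork"):
        return
    skip_no_fork = pytest.mark.skip(reason="os.fork is not available on this platform")
    for item in items:
        if item.get_closest_marker("forked") is not None:
            item.add_marker(skip_no_fork)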

Test failures with Python 3.12 due to warnings (from pytest)

While technically the "problem" is in pytest itself, I think it would be better if the test suite were more resilient to warnings that don't affect the result.

========================================================= test session starts =========================================================
platform linux -- Python 3.12.0b1, pytest-7.3.1, pluggy-1.0.0
cachedir: .tox/py312/.pytest_cache
rootdir: /tmp/pytest-forked
configfile: tox.ini
plugins: forked-1.6.1.dev4+gd9d05e2
collected 10 items                                                                                                                    

testing/test_boxed.py ...xx.                                                                                                    [ 60%]
testing/test_xfail_behavior.py .F.F                                                                                             [100%]

============================================================== FAILURES ===============================================================
______________________________________________________ test_xfail[strict xpass] _______________________________________________________

is_crashing = False, is_strict = True, testdir = <Testdir local('/tmp/pytest-of-mgorny/pytest-2/test_xfail1')>

    @pytest.mark.parametrize(
        ("is_crashing", "is_strict"),
        (
            pytest.param(True, True, id="strict xfail"),
            pytest.param(False, True, id="strict xpass"),
            pytest.param(True, False, id="non-strict xfail"),
            pytest.param(False, False, id="non-strict xpass"),
        ),
    )
    def test_xfail(is_crashing, is_strict, testdir):
        """Test xfail/xpass/strict permutations."""
        # pylint: disable=possibly-unused-variable
        sig_num = signal.SIGTERM.numerator
    
        test_func_body = (
            "os.kill(os.getpid(), signal.SIGTERM)" if is_crashing else "assert True"
        )
    
        if is_crashing:
            # marked xfailed and crashing, no matter strict or not
            expected_letter = "x"  # XFAILED
            expected_lowercase = "xfailed"
            expected_word = "XFAIL"
        elif is_strict:
            # strict and not failing as expected should cause failure
            expected_letter = "F"  # FAILED
            expected_lowercase = "failed"
            expected_word = FAILED_WORD
        elif not is_strict:
            # non-strict and not failing as expected should cause xpass
            expected_letter = "X"  # XPASS
            expected_lowercase = "xpassed"
            expected_word = "XPASS"
    
        session_start_title = "*==== test session starts ====*"
        loaded_pytest_plugins = "plugins: forked*"
        collected_tests_num = "collected 1 item"
        expected_progress = f"test_xfail.py {expected_letter!s}*"
        failures_title = "*==== FAILURES ====*"
        failures_test_name = "*____ test_function ____*"
        failures_test_reason = "[XPASS(strict)] The process gets terminated"
        short_test_summary_title = "*==== short test summary info ====*"
        short_test_summary = f"{expected_word!s} test_xfail.py::test_function"
        if expected_lowercase == "xpassed":
            # XPASS wouldn't have the crash message from
            # pytest-forked because the crash doesn't happen
            short_test_summary = " ".join(
                (
                    short_test_summary,
                    "The process gets terminated",
                )
            )
        reason_string = (
            f"reason: The process gets terminated; "
            f"pytest-forked reason: "
            f"*:*: running the test CRASHED with signal {sig_num:d}"
        )
        if expected_lowercase == "xfailed" and PYTEST_GTE_7_2:
            short_test_summary += " - " + reason_string
        total_summary_line = f"*==== 1 {expected_lowercase!s} in 0.*s* ====*"
    
        expected_lines = (
            session_start_title,
            loaded_pytest_plugins,
            collected_tests_num,
            expected_progress,
        )
        if expected_word == FAILED_WORD:
            # XPASS(strict)
            expected_lines += (
                failures_title,
                failures_test_name,
                failures_test_reason,
            )
        expected_lines += (
            short_test_summary_title,
            short_test_summary,
        )
        if expected_lowercase == "xpassed" and expected_word == FAILED_WORD:
            # XPASS(strict)
            expected_lines += ("  " + reason_string,)
        expected_lines += (total_summary_line,)
    
        test_module = testdir.makepyfile(
            f"""
            import os
            import signal
    
            import pytest
    
            # The current implementation emits RuntimeWarning.
            pytestmark = pytest.mark.filterwarnings('ignore:pytest-forked xfail')
    
            @pytest.mark.xfail(
                reason='The process gets terminated',
                strict={is_strict!s},
            )
            @pytest.mark.forked
            def test_function():
                {test_func_body!s}
            """
        )
    
        pytest_run_result = testdir.runpytest(test_module, "-ra")
>       pytest_run_result.stdout.fnmatch_lines(expected_lines)
E       Failed: fnmatch: '*==== test session starts ====*'
E          with: '========================================================= test session starts ========================================================='
E       nomatch: 'plugins: forked*'
E           and: 'platform linux -- Python 3.12.0b1, pytest-7.3.1, pluggy-1.0.0'
E           and: 'rootdir: /tmp/pytest-of-mgorny/pytest-2/test_xfail1'
E       fnmatch: 'plugins: forked*'
E          with: 'plugins: forked-1.6.1.dev4+gd9d05e2'
E       exact match: 'collected 1 item'
E       nomatch: 'test_xfail.py F*'
E           and: ''
E       fnmatch: 'test_xfail.py F*'
E          with: 'test_xfail.py F                                                                                                                 [100%]'
E       nomatch: '*==== FAILURES ====*'
E           and: ''
E       fnmatch: '*==== FAILURES ====*'
E          with: '============================================================== FAILURES ==============================================================='
E       fnmatch: '*____ test_function ____*'
E          with: '____________________________________________________________ test_function ____________________________________________________________'
E       exact match: '[XPASS(strict)] The process gets terminated'
E       nomatch: '*==== short test summary info ====*'
E           and: '========================================================== warnings summary ==========================================================='
E           and: '../../../pytest-forked/.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:927'
E           and: '  /tmp/pytest-forked/.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:927: DeprecationWarning: ast.Str is deprecated and will be removed in Python 3.14; use ast.Constant instead'
E           and: '    assertmsg = ast.Str("")'
E           and: ''
E           and: '../../../pytest-forked/.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:929'
E           and: '  /tmp/pytest-forked/.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:929: DeprecationWarning: ast.Str is deprecated and will be removed in Python 3.14; use ast.Constant instead'
E           and: '    template = ast.BinOp(assertmsg, ast.Add(), ast.Str(explanation))'
E           and: ''
E           and: '../../../pytest-forked/.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:817'
E           and: '  /tmp/pytest-forked/.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:817: DeprecationWarning: ast.Str is deprecated and will be removed in Python 3.14; use ast.Constant instead'
E           and: '    keys = [ast.Str(key) for key in current.keys()]'
E           and: ''
E           and: '../../../pytest-forked/.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:941'
E           and: '  /tmp/pytest-forked/.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:941: DeprecationWarning: ast.NameConstant is deprecated and will be removed in Python 3.14; use ast.Constant instead'
E           and: '    clear = ast.Assign(variables, ast.NameConstant(None))'
E           and: ''
E           and: '-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html'
E       fnmatch: '*==== short test summary info ====*'
E          with: '======================================================= short test summary info ======================================================='
E       exact match: 'FAILED test_xfail.py::test_function'
E       nomatch: '*==== 1 failed in 0.*s* ====*'
E           and: '==================================================== 1 failed, 4 warnings in 0.01s ===================================================='
E       remains unmatched: '*==== 1 failed in 0.*s* ====*'

/tmp/pytest-forked/testing/test_xfail_behavior.py:121: Failed
-------------------------------------------------------- Captured stdout call ---------------------------------------------------------
========================================================= test session starts =========================================================
platform linux -- Python 3.12.0b1, pytest-7.3.1, pluggy-1.0.0
rootdir: /tmp/pytest-of-mgorny/pytest-2/test_xfail1
plugins: forked-1.6.1.dev4+gd9d05e2
collected 1 item

test_xfail.py F                                                                                                                 [100%]

============================================================== FAILURES ===============================================================
____________________________________________________________ test_function ____________________________________________________________
[XPASS(strict)] The process gets terminated
========================================================== warnings summary ===========================================================
../../../pytest-forked/.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:927
  /tmp/pytest-forked/.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:927: DeprecationWarning: ast.Str is deprecated and will be removed in Python 3.14; use ast.Constant instead
    assertmsg = ast.Str("")

../../../pytest-forked/.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:929
  /tmp/pytest-forked/.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:929: DeprecationWarning: ast.Str is deprecated and will be removed in Python 3.14; use ast.Constant instead
    template = ast.BinOp(assertmsg, ast.Add(), ast.Str(explanation))

../../../pytest-forked/.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:817
  /tmp/pytest-forked/.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:817: DeprecationWarning: ast.Str is deprecated and will be removed in Python 3.14; use ast.Constant instead
    keys = [ast.Str(key) for key in current.keys()]

../../../pytest-forked/.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:941
  /tmp/pytest-forked/.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:941: DeprecationWarning: ast.NameConstant is deprecated and will be removed in Python 3.14; use ast.Constant instead
    clear = ast.Assign(variables, ast.NameConstant(None))

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
======================================================= short test summary info =======================================================
FAILED test_xfail.py::test_function
==================================================== 1 failed, 4 warnings in 0.01s ====================================================
____________________________________________________ test_xfail[non-strict xpass] _____________________________________________________

is_crashing = False, is_strict = False, testdir = <Testdir local('/tmp/pytest-of-mgorny/pytest-2/test_xfail3')>

    @pytest.mark.parametrize(
        ("is_crashing", "is_strict"),
        (
            pytest.param(True, True, id="strict xfail"),
            pytest.param(False, True, id="strict xpass"),
            pytest.param(True, False, id="non-strict xfail"),
            pytest.param(False, False, id="non-strict xpass"),
        ),
    )
    def test_xfail(is_crashing, is_strict, testdir):
        """Test xfail/xpass/strict permutations."""
        # pylint: disable=possibly-unused-variable
        sig_num = signal.SIGTERM.numerator
    
        test_func_body = (
            "os.kill(os.getpid(), signal.SIGTERM)" if is_crashing else "assert True"
        )
    
        if is_crashing:
            # marked xfailed and crashing, no matter strict or not
            expected_letter = "x"  # XFAILED
            expected_lowercase = "xfailed"
            expected_word = "XFAIL"
        elif is_strict:
            # strict and not failing as expected should cause failure
            expected_letter = "F"  # FAILED
            expected_lowercase = "failed"
            expected_word = FAILED_WORD
        elif not is_strict:
            # non-strict and not failing as expected should cause xpass
            expected_letter = "X"  # XPASS
            expected_lowercase = "xpassed"
            expected_word = "XPASS"
    
        session_start_title = "*==== test session starts ====*"
        loaded_pytest_plugins = "plugins: forked*"
        collected_tests_num = "collected 1 item"
        expected_progress = f"test_xfail.py {expected_letter!s}*"
        failures_title = "*==== FAILURES ====*"
        failures_test_name = "*____ test_function ____*"
        failures_test_reason = "[XPASS(strict)] The process gets terminated"
        short_test_summary_title = "*==== short test summary info ====*"
        short_test_summary = f"{expected_word!s} test_xfail.py::test_function"
        if expected_lowercase == "xpassed":
            # XPASS wouldn't have the crash message from
            # pytest-forked because the crash doesn't happen
            short_test_summary = " ".join(
                (
                    short_test_summary,
                    "The process gets terminated",
                )
            )
        reason_string = (
            f"reason: The process gets terminated; "
            f"pytest-forked reason: "
            f"*:*: running the test CRASHED with signal {sig_num:d}"
        )
        if expected_lowercase == "xfailed" and PYTEST_GTE_7_2:
            short_test_summary += " - " + reason_string
        total_summary_line = f"*==== 1 {expected_lowercase!s} in 0.*s* ====*"
    
        expected_lines = (
            session_start_title,
            loaded_pytest_plugins,
            collected_tests_num,
            expected_progress,
        )
        if expected_word == FAILED_WORD:
            # XPASS(strict)
            expected_lines += (
                failures_title,
                failures_test_name,
                failures_test_reason,
            )
        expected_lines += (
            short_test_summary_title,
            short_test_summary,
        )
        if expected_lowercase == "xpassed" and expected_word == FAILED_WORD:
            # XPASS(strict)
            expected_lines += ("  " + reason_string,)
        expected_lines += (total_summary_line,)
    
        test_module = testdir.makepyfile(
            f"""
            import os
            import signal
    
            import pytest
    
            # The current implementation emits RuntimeWarning.
            pytestmark = pytest.mark.filterwarnings('ignore:pytest-forked xfail')
    
            @pytest.mark.xfail(
                reason='The process gets terminated',
                strict={is_strict!s},
            )
            @pytest.mark.forked
            def test_function():
                {test_func_body!s}
            """
        )
    
        pytest_run_result = testdir.runpytest(test_module, "-ra")
>       pytest_run_result.stdout.fnmatch_lines(expected_lines)
E       Failed: fnmatch: '*==== test session starts ====*'
E          with: '========================================================= test session starts ========================================================='
E       nomatch: 'plugins: forked*'
E           and: 'platform linux -- Python 3.12.0b1, pytest-7.3.1, pluggy-1.0.0'
E           and: 'rootdir: /tmp/pytest-of-mgorny/pytest-2/test_xfail3'
E       fnmatch: 'plugins: forked*'
E          with: 'plugins: forked-1.6.1.dev4+gd9d05e2'
E       exact match: 'collected 1 item'
E       nomatch: 'test_xfail.py X*'
E           and: ''
E       fnmatch: 'test_xfail.py X*'
E          with: 'test_xfail.py X                                                                                                                 [100%]'
E       nomatch: '*==== short test summary info ====*'
E           and: ''
E           and: '========================================================== warnings summary ==========================================================='
E           and: '../../../pytest-forked/.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:927'
E           and: '  /tmp/pytest-forked/.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:927: DeprecationWarning: ast.Str is deprecated and will be removed in Python 3.14; use ast.Constant instead'
E           and: '    assertmsg = ast.Str("")'
E           and: ''
E           and: '../../../pytest-forked/.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:929'
E           and: '  /tmp/pytest-forked/.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:929: DeprecationWarning: ast.Str is deprecated and will be removed in Python 3.14; use ast.Constant instead'
E           and: '    template = ast.BinOp(assertmsg, ast.Add(), ast.Str(explanation))'
E           and: ''
E           and: '../../../pytest-forked/.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:817'
E           and: '  /tmp/pytest-forked/.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:817: DeprecationWarning: ast.Str is deprecated and will be removed in Python 3.14; use ast.Constant instead'
E           and: '    keys = [ast.Str(key) for key in current.keys()]'
E           and: ''
E           and: '../../../pytest-forked/.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:941'
E           and: '  /tmp/pytest-forked/.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:941: DeprecationWarning: ast.NameConstant is deprecated and will be removed in Python 3.14; use ast.Constant instead'
E           and: '    clear = ast.Assign(variables, ast.NameConstant(None))'
E           and: ''
E           and: '-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html'
E       fnmatch: '*==== short test summary info ====*'
E          with: '======================================================= short test summary info ======================================================='
E       exact match: 'XPASS test_xfail.py::test_function The process gets terminated'
E       nomatch: '*==== 1 xpassed in 0.*s* ====*'
E           and: '=================================================== 1 xpassed, 4 warnings in 0.01s ===================================================='
E       remains unmatched: '*==== 1 xpassed in 0.*s* ====*'

/tmp/pytest-forked/testing/test_xfail_behavior.py:121: Failed
-------------------------------------------------------- Captured stdout call ---------------------------------------------------------
========================================================= test session starts =========================================================
platform linux -- Python 3.12.0b1, pytest-7.3.1, pluggy-1.0.0
rootdir: /tmp/pytest-of-mgorny/pytest-2/test_xfail3
plugins: forked-1.6.1.dev4+gd9d05e2
collected 1 item

test_xfail.py X                                                                                                                 [100%]

========================================================== warnings summary ===========================================================
../../../pytest-forked/.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:927
  /tmp/pytest-forked/.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:927: DeprecationWarning: ast.Str is deprecated and will be removed in Python 3.14; use ast.Constant instead
    assertmsg = ast.Str("")

../../../pytest-forked/.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:929
  /tmp/pytest-forked/.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:929: DeprecationWarning: ast.Str is deprecated and will be removed in Python 3.14; use ast.Constant instead
    template = ast.BinOp(assertmsg, ast.Add(), ast.Str(explanation))

../../../pytest-forked/.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:817
  /tmp/pytest-forked/.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:817: DeprecationWarning: ast.Str is deprecated and will be removed in Python 3.14; use ast.Constant instead
    keys = [ast.Str(key) for key in current.keys()]

../../../pytest-forked/.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:941
  /tmp/pytest-forked/.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:941: DeprecationWarning: ast.NameConstant is deprecated and will be removed in Python 3.14; use ast.Constant instead
    clear = ast.Assign(variables, ast.NameConstant(None))

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
======================================================= short test summary info =======================================================
XPASS test_xfail.py::test_function The process gets terminated
=================================================== 1 xpassed, 4 warnings in 0.01s ====================================================
========================================================== warnings summary ===========================================================
.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:683
.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:683
.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:683
  /tmp/pytest-forked/.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:683: DeprecationWarning: ast.Str is deprecated and will be removed in Python 3.14; use ast.Constant instead
    and isinstance(item.value, ast.Str)

.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:685
.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:685
  /tmp/pytest-forked/.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:685: DeprecationWarning: Attribute s is deprecated and will be removed in Python 3.14; use value instead
    doc = item.value.s

.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:965
.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:965
.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:965
.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:965
.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:965
  /tmp/pytest-forked/.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:965: DeprecationWarning: ast.Str is deprecated and will be removed in Python 3.14; use ast.Constant instead
    inlocs = ast.Compare(ast.Str(name.id), [ast.In()], [locs])

.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:968
.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:968
.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:968
.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:968
.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:968
  /tmp/pytest-forked/.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:968: DeprecationWarning: ast.Str is deprecated and will be removed in Python 3.14; use ast.Constant instead
    expr = ast.IfExp(test, self.display(name), ast.Str(name.id))

.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:1102
.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:1102
  /tmp/pytest-forked/.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:1102: DeprecationWarning: ast.Str is deprecated and will be removed in Python 3.14; use ast.Constant instead
    syms.append(ast.Str(sym))

.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:1104
.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:1104
  /tmp/pytest-forked/.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:1104: DeprecationWarning: ast.Str is deprecated and will be removed in Python 3.14; use ast.Constant instead
    expls.append(ast.Str(expl))

.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:817
.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:817
.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:817
.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:817
.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:817
.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:817
.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:817
.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:817
.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:817
  /tmp/pytest-forked/.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:817: DeprecationWarning: ast.Str is deprecated and will be removed in Python 3.14; use ast.Constant instead
    keys = [ast.Str(key) for key in current.keys()]

.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:927
.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:927
  /tmp/pytest-forked/.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:927: DeprecationWarning: ast.Str is deprecated and will be removed in Python 3.14; use ast.Constant instead
    assertmsg = ast.Str("")

.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:929
.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:929
.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:929
  /tmp/pytest-forked/.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:929: DeprecationWarning: ast.Str is deprecated and will be removed in Python 3.14; use ast.Constant instead
    template = ast.BinOp(assertmsg, ast.Add(), ast.Str(explanation))

.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:941
.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:941
.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:941
.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:941
  /tmp/pytest-forked/.tox/py312/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:941: DeprecationWarning: ast.NameConstant is deprecated and will be removed in Python 3.14; use ast.Constant instead
    clear = ast.Assign(variables, ast.NameConstant(None))

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
======================================================= short test summary info =======================================================
FAILED testing/test_xfail_behavior.py::test_xfail[strict xpass] - Failed: fnmatch: '*==== test session starts ====*'
FAILED testing/test_xfail_behavior.py::test_xfail[non-strict xpass] - Failed: fnmatch: '*==== test session starts ====*'
XFAIL testing/test_boxed.py::test_functional_boxed_capturing[sys] - capture cleanup needed
XFAIL testing/test_boxed.py::test_functional_boxed_capturing[fd] - capture cleanup needed
========================================= 2 failed, 6 passed, 2 xfailed, 37 warnings in 0.66s =========================================
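
One possible way to make the suite more resilient (only a sketch; filtering via the ini configuration may be preferable) is to silence DeprecationWarning in the inner pytest runs, so no "warnings summary" section interleaves with the sections the fnmatch patterns expect:

# testing/test_xfail_behavior.py (sketch): suppress interpreter deprecation
# warnings in the pytest run under test so its output contains no
# "warnings summary" section.
pytest_run_result = testdir.runpytest(
    test_module, "-ra", "-W", "ignore::DeprecationWarning"
)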

Update requirements

The current `requires` list in pyproject.toml is quite old:

requires = ['setuptools ~= 41.4', 'setuptools_scm ~= 3.3', 'wheel ~= 0.33.6']

Specifically, pinning setuptools to the old 41.x series is problematic, as it behaves quite differently from a modern setuptools.

v1.3.0 not available on conda-forge

Hi, I saw the warning that maintenance is down to the bare minimum, but I was wondering about the process for conda-forge builds. I cannot find v1.3.0 on conda-forge, and I don't know what is required to get that build published.

[FR] Integrate `xfail`

I have a test that SEGFAULTs in random environments (usually only in one or two jobs in the matrix), so I decided to use forked to isolate it and marked it with xfail as follows:

@pytest.mark.xfail(
    reason='This test causes SEGFAULT, flakily.',
    strict=False,
)
@pytest.mark.forked
def test_exec_command(ssh_channel):
    """Test getting the output of a remotely executed command."""
    ...

But, despite my expectations, when it crashes, pytest shows it as a failure and not as XFAIL.

Could you please clarify whether this behavior is intended or is it a bug? (looks like a bug to me)

Am I doing anything wrong? Am I missing something here?

(Ref: https://github.com/ansible/pylibssh/runs/728770734?check_suite_focus=true#step:9:120)

Tarball contains .pyc files

$ tar -tvf pytest-forked-0.2.tar.gz | grep \.pyc
-rw-r--r-- rpfannsc/rpfannsc  819 2017-07-28 10:27 pytest-forked-0.2/testing/conftest.pyc
drwxrwxr-x rpfannsc/rpfannsc    0 2017-08-04 14:59 pytest-forked-0.2/testing/__pycache__/
-rw-rw-r-- rpfannsc/rpfannsc  961 2017-07-28 10:27 pytest-forked-0.2/testing/__pycache__/conftest.cpython-27-PYTEST.pyc
-rw-rw-r-- rpfannsc/rpfannsc  718 2017-07-28 10:45 pytest-forked-0.2/testing/__pycache__/conftest.cpython-36-PYTEST.pyc
-rw-rw-r-- rpfannsc/rpfannsc  769 2017-07-28 10:27 pytest-forked-0.2/testing/__pycache__/conftest.cpython-35-PYTEST.pyc
-rw-rw-r-- rpfannsc/rpfannsc 2480 2017-08-04 14:28 pytest-forked-0.2/testing/__pycache__/test_boxed.cpython-27-PYTEST.pyc
-rw-r--r-- rpfannsc/rpfannsc  611 2017-08-04 14:28 pytest-forked-0.2/testing/__pycache__/conftest.cpython-36.pyc
-rw-rw-r-- rpfannsc/rpfannsc 2076 2017-08-04 14:28 pytest-forked-0.2/testing/__pycache__/test_boxed.cpython-35-PYTEST.pyc
-rw-r--r-- rpfannsc/rpfannsc  652 2017-07-28 10:27 pytest-forked-0.2/testing/__pycache__/conftest.cpython-35.pyc
-rw-rw-r-- rpfannsc/rpfannsc 1923 2017-08-04 14:06 pytest-forked-0.2/testing/__pycache__/test_boxed.cpython-36-PYTEST.pyc
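
A possible fix (just a sketch; the project's actual MANIFEST.in may differ) is to exclude byte-compiled files and caches from the sdist:

global-exclude *.pyc
prune testing/__pycache__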

Installation fails with older pip due to missing wheels

I get the error:

$ pip install pytest-forked
Collecting pytest-forked
  Using cached https://files.pythonhosted.org/packages/ae/9c/8f0c51c98ee5165ff575f196662a4a314ff07c9d3de64a94580c982edcee/pytest-forked-1.1.1.tar.gz
    ERROR: Command errored out with exit status 1:
     command: /Users/msullivan/asdf/bin/python3.5 -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/private/var/folders/mw/l4r6v5r5575dyyz8kcdqvcv0lkcl0v/T/pip-install-3pj0e0jj/pytest-forked/setup.py'"'"'; __file__='"'"'/private/var/folders/mw/l4r6v5r5575dyyz8kcdqvcv0lkcl0v/T/pip-install-3pj0e0jj/pytest-forked/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' egg_info --egg-base pip-egg-info
         cwd: /private/var/folders/mw/l4r6v5r5575dyyz8kcdqvcv0lkcl0v/T/pip-install-3pj0e0jj/pytest-forked/
    Complete output (26 lines):
    Download error on https://pypi.org/simple/setuptools_scm/: [SSL: TLSV1_ALERT_PROTOCOL_VERSION] tlsv1 alert protocol version (_ssl.c:719) -- Some packages may not be found!
    Download error on https://pypi.org/simple/setuptools-scm/: [SSL: TLSV1_ALERT_PROTOCOL_VERSION] tlsv1 alert protocol version (_ssl.c:719) -- Some packages may not be found!
    Couldn't find index page for 'setuptools_scm' (maybe misspelled?)
    Download error on https://pypi.org/simple/: [SSL: TLSV1_ALERT_PROTOCOL_VERSION] tlsv1 alert protocol version (_ssl.c:719) -- Some packages may not be found!
    No local packages or working download links found for setuptools_scm
    Traceback (most recent call last):
      File "<string>", line 1, in <module>
      File "/private/var/folders/mw/l4r6v5r5575dyyz8kcdqvcv0lkcl0v/T/pip-install-3pj0e0jj/pytest-forked/setup.py", line 41, in <module>
        'Programming Language :: Python :: 3.7',
      File "/Users/msullivan/asdf/lib/python3.5/site-packages/setuptools/__init__.py", line 144, in setup
        _install_setup_requires(attrs)
      File "/Users/msullivan/asdf/lib/python3.5/site-packages/setuptools/__init__.py", line 139, in _install_setup_requires
        dist.fetch_build_eggs(dist.setup_requires)
      File "/Users/msullivan/asdf/lib/python3.5/site-packages/setuptools/dist.py", line 720, in fetch_build_eggs
        replace_conflicting=True,
      File "/Users/msullivan/asdf/lib/python3.5/site-packages/pkg_resources/__init__.py", line 782, in resolve
        replace_conflicting=replace_conflicting
      File "/Users/msullivan/asdf/lib/python3.5/site-packages/pkg_resources/__init__.py", line 1065, in best_match
        return self.obtain(req, installer)
      File "/Users/msullivan/asdf/lib/python3.5/site-packages/pkg_resources/__init__.py", line 1077, in obtain
        return installer(requirement)
      File "/Users/msullivan/asdf/lib/python3.5/site-packages/setuptools/dist.py", line 787, in fetch_build_egg
        return cmd.easy_install(req)
      File "/Users/msullivan/asdf/lib/python3.5/site-packages/setuptools/command/easy_install.py", line 673, in easy_install
        raise DistutilsError(msg)
    distutils.errors.DistutilsError: Could not find suitable distribution for Requirement.parse('setuptools_scm')
    ----------------------------------------
ERROR: Command errored out with exit status 1: python setup.py egg_info Check the logs for full command output.

It works on Python 3.6/3.7, and it works on Linux.

I think maybe setuptools_scm needs to be specified in setup.cfg as setup_requires?
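
For illustration, the setup.cfg route suggested above would look roughly like the snippet below; declaring the build requirement in pyproject.toml (PEP 518) is the more robust fix, since pip installs it in an isolated build environment instead of falling back to easy_install:

[options]
setup_requires =
    setuptools_scm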

Pytest hangs when using faulthandler_timeout with pytest-forked

Steps to reproduce

Create two files in the root workspace:

# foo.py
def test_bar():
    fjaoisjdfoiajsdofi
# pyproject.toml
[tool.pytest.ini_options]
faulthandler_timeout=1800

Create a minimal venv with pytest-forked

mkvirtualenv foo --python 3.7
pip install pytest-forked==1.3.0

Run the following:

pytest foo.py --forked

pytest then hangs indefinitely (collection succeeds, but the test never finishes):

=========================================================================================== test session starts ============================================================================================
platform linux -- Python 3.7.5, pytest-6.2.2, py-1.10.0, pluggy-0.13.1
rootdir: /home/tyler/playground/pytest-forked-repro, configfile: pyproject.toml
plugins: forked-1.3.0
collected 1 item                                                                                                                                                                                           

foo.py 

-s option ?

Great package, very useful.
I didn't succeed in using the -s option (to print stdout during the tests).
Is there any workaround to use it?
Thanks!

AttributeError: 'Function' object has no attribute '_getfslineno'

_getfslineno has been removed from pytest in pytest-dev/pytest@16546b734
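
A hedged sketch of a compatible fallback, using the item's public reportinfo() when the old helper is gone (illustrative only; the helper name below is made up):

def _crash_location(item):
    try:
        from _pytest.compat import getfslineno  # present in older pytest only
    except ImportError:
        # reportinfo() returns (path, lineno, domain) on newer pytest
        path, lineno = item.reportinfo()[:2]
    else:
        path, lineno = getfslineno(item)
    return path, lineno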

===================================== test session starts ======================================
platform linux -- Python 3.8.2, pytest-5.3.5.dev883+g03dca1aeb.d20200326, py-1.8.2.dev5+g043b5868.d20200315, pluggy-0.13.2.dev20+g6e8b6d4.d20200311
rootdir: …/src/pytest, inifile: tox.ini
plugins: xdist-1.31.0, pyannotate-1.2.0, forked-1.1.3
collected 57 items / 56 deselected / 1 selected

testing/test_runner_new.py ===================================== test session starts ======================================
platform linux -- Python 3.8.2, pytest-5.3.5.dev883+g03dca1aeb.d20200326, py-1.8.2.dev5+g043b5868.d20200315, pluggy-0.13.2.dev20+g6e8b6d4.d20200311
rootdir: /tmp/pytest-of-user/pytest-114/test_boxed_suicide0
plugins: xdist-1.31.0, pyannotate-1.2.0, forked-1.1.3
collected 1 item
INTERNALERROR> Traceback (most recent call last):
INTERNALERROR>   File "…/Vcs/pytest-forked/src/pytest_forked/__init__.py", line 75, in report_process_crash
INTERNALERROR>     from _pytest.compat import getfslineno
INTERNALERROR> ImportError: cannot import name 'getfslineno' from '_pytest.compat' (…/src/pytest/src/_pytest/compat.py)
INTERNALERROR>
INTERNALERROR> During handling of the above exception, another exception occurred:
INTERNALERROR>
INTERNALERROR> Traceback (most recent call last):
INTERNALERROR>   File "…/src/pytest/src/_pytest/main.py", line 191, in wrap_session
INTERNALERROR>     session.exitstatus = doit(config, session) or 0
INTERNALERROR>   File "…/src/pytest/src/_pytest/main.py", line 247, in _main
INTERNALERROR>     config.hook.pytest_runtestloop(session=session)
INTERNALERROR>   File "…/Vcs/pluggy/src/pluggy/hooks.py", line 292, in __call__
INTERNALERROR>     return self._hookexec(self, self.get_hookimpls(), kwargs)
INTERNALERROR>   File "…/Vcs/pluggy/src/pluggy/manager.py", line 93, in _hookexec
INTERNALERROR>     return self._inner_hookexec(hook, methods, kwargs)
INTERNALERROR>   File "…/Vcs/pluggy/src/pluggy/manager.py", line 349, in traced_hookexec
INTERNALERROR>     return outcome.get_result()
INTERNALERROR>   File "…/Vcs/pluggy/src/pluggy/callers.py", line 88, in get_result
INTERNALERROR>     raise ex[1].with_traceback(ex[2])
INTERNALERROR>   File "…/Vcs/pluggy/src/pluggy/callers.py", line 60, in from_call
INTERNALERROR>     result = func()
INTERNALERROR>   File "…/Vcs/pluggy/src/pluggy/manager.py", line 347, in <lambda>
INTERNALERROR>     outcome = _Result.from_call(lambda: oldcall(hook, hook_impls, kwargs))
INTERNALERROR>   File "…/Vcs/pluggy/src/pluggy/manager.py", line 84, in <lambda>
INTERNALERROR>     self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
INTERNALERROR>   File "…/Vcs/pluggy/src/pluggy/callers.py", line 216, in _multicall
INTERNALERROR>     return outcome.get_result()
INTERNALERROR>   File "…/Vcs/pluggy/src/pluggy/callers.py", line 88, in get_result
INTERNALERROR>     raise ex[1].with_traceback(ex[2])
INTERNALERROR>   File "…/Vcs/pluggy/src/pluggy/callers.py", line 195, in _multicall
INTERNALERROR>     res = hook_impl.function(*args)
INTERNALERROR>   File "…/src/pytest/src/_pytest/main.py", line 272, in pytest_runtestloop
INTERNALERROR>     item.config.hook.pytest_runtest_protocol(item=item, nextitem=nextitem)
INTERNALERROR>   File "…/Vcs/pluggy/src/pluggy/hooks.py", line 292, in __call__
INTERNALERROR>     return self._hookexec(self, self.get_hookimpls(), kwargs)
INTERNALERROR>   File "…/Vcs/pluggy/src/pluggy/manager.py", line 93, in _hookexec
INTERNALERROR>     return self._inner_hookexec(hook, methods, kwargs)
INTERNALERROR>   File "…/Vcs/pluggy/src/pluggy/manager.py", line 349, in traced_hookexec
INTERNALERROR>     return outcome.get_result()
INTERNALERROR>   File "…/Vcs/pluggy/src/pluggy/callers.py", line 88, in get_result
INTERNALERROR>     raise ex[1].with_traceback(ex[2])
INTERNALERROR>   File "…/Vcs/pluggy/src/pluggy/callers.py", line 60, in from_call
INTERNALERROR>     result = func()
INTERNALERROR>   File "…/Vcs/pluggy/src/pluggy/manager.py", line 347, in <lambda>
INTERNALERROR>     outcome = _Result.from_call(lambda: oldcall(hook, hook_impls, kwargs))
INTERNALERROR>   File "…/Vcs/pluggy/src/pluggy/manager.py", line 84, in <lambda>
INTERNALERROR>     self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
INTERNALERROR>   File "…/Vcs/pluggy/src/pluggy/callers.py", line 216, in _multicall
INTERNALERROR>     return outcome.get_result()
INTERNALERROR>   File "…/Vcs/pluggy/src/pluggy/callers.py", line 88, in get_result
INTERNALERROR>     raise ex[1].with_traceback(ex[2])
INTERNALERROR>   File "…/Vcs/pluggy/src/pluggy/callers.py", line 195, in _multicall
INTERNALERROR>     res = hook_impl.function(*args)
INTERNALERROR>   File "…/Vcs/pytest-forked/src/pytest_forked/__init__.py", line 42, in pytest_runtest_protocol
INTERNALERROR>     reports = forked_run_report(item)
INTERNALERROR>   File "…/Vcs/pytest-forked/src/pytest_forked/__init__.py", line 70, in forked_run_report
INTERNALERROR>     return [report_process_crash(item, result)]
INTERNALERROR>   File "…/Vcs/pytest-forked/src/pytest_forked/__init__.py", line 78, in report_process_crash
INTERNALERROR>     path, lineno = item._getfslineno()
INTERNALERROR> AttributeError: 'Function' object has no attribute '_getfslineno'

==================================== no tests ran in 0.02s =====================================

>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> PDB set_trace >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
ENTER /tmp/pytest-of-user/pytest-114/test_boxed_suicide0 => /home/user
--Return--
[32] > …/src/pytest/testing/test_runner_new.py(439)test_boxed_suicide()->None
-> __import__('pdb').set_trace()
   5 frames hidden (try 'help hidden_frames')

`pytest_runtest_logstart` and `pytest_runtest_logfinish` are not in effect

pytest-forked reimplements pytest_runtest_protocol, but pytest_runtest_logstart and pytest_runtest_logfinish are never called.

Here is the relevant code in pytest-forked and in pytest:

def pytest_runtest_protocol(item):
    if item.config.getvalue("forked") or item.get_closest_marker("forked"):
        reports = forked_run_report(item)
        for rep in reports:
            item.ihook.pytest_runtest_logreport(report=rep)
        return True

https://github.com/pytest-dev/pytest/blob/259e5d0610202453f9beace6cab336b11d4353a9/src/_pytest/runner.py#L87
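A minimal sketch (reusing forked_run_report from the snippet above; this is an illustration, not the plugin's actual code) of how the reimplemented protocol could also emit the missing hooks:

def pytest_runtest_protocol(item):
    if item.config.getvalue("forked") or item.get_closest_marker("forked"):
        ihook = item.ihook
        # announce the test to terminal/report plugins before running it
        ihook.pytest_runtest_logstart(nodeid=item.nodeid, location=item.location)
        reports = forked_run_report(item)
        for rep in reports:
            ihook.pytest_runtest_logreport(report=rep)
        # ...and close it out once all reports have been delivered
        ihook.pytest_runtest_logfinish(nodeid=item.nodeid, location=item.location)
        return True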

1.3.0: tox warnings and pytest is failing with `flaky` and `relaxed` extensions

Sometimes it is good to report even some minor warnings :)

+ cd pytest-forked-1.3.0
+ /usr/bin/python3 -Bm tox --skip-missing-interpreters
.package create: /home/tkloczko/rpmbuild/BUILD/pytest-forked-1.3.0/.tox/.package
.package installdeps: setuptools ~= 41.4, setuptools_scm ~= 3.3, wheel ~= 0.33.6
py27-pytest310 create: /home/tkloczko/rpmbuild/BUILD/pytest-forked-1.3.0/.tox/py27-pytest310
SKIPPED: InterpreterNotFound: python2.7
py27-pytest46 create: /home/tkloczko/rpmbuild/BUILD/pytest-forked-1.3.0/.tox/py27-pytest46
SKIPPED: InterpreterNotFound: python2.7
py27-pytest54 create: /home/tkloczko/rpmbuild/BUILD/pytest-forked-1.3.0/.tox/py27-pytest54
SKIPPED: InterpreterNotFound: python2.7
py27-pytestlatest create: /home/tkloczko/rpmbuild/BUILD/pytest-forked-1.3.0/.tox/py27-pytestlatest
SKIPPED: InterpreterNotFound: python2.7
py35-pytest310 create: /home/tkloczko/rpmbuild/BUILD/pytest-forked-1.3.0/.tox/py35-pytest310
SKIPPED: InterpreterNotFound: python3.5
py35-pytest46 create: /home/tkloczko/rpmbuild/BUILD/pytest-forked-1.3.0/.tox/py35-pytest46
SKIPPED: InterpreterNotFound: python3.5
py35-pytest54 create: /home/tkloczko/rpmbuild/BUILD/pytest-forked-1.3.0/.tox/py35-pytest54
SKIPPED: InterpreterNotFound: python3.5
py35-pytestlatest create: /home/tkloczko/rpmbuild/BUILD/pytest-forked-1.3.0/.tox/py35-pytestlatest
SKIPPED: InterpreterNotFound: python3.5
py36-pytest310 create: /home/tkloczko/rpmbuild/BUILD/pytest-forked-1.3.0/.tox/py36-pytest310
SKIPPED: InterpreterNotFound: python3.6
py36-pytest46 create: /home/tkloczko/rpmbuild/BUILD/pytest-forked-1.3.0/.tox/py36-pytest46
SKIPPED: InterpreterNotFound: python3.6
py36-pytest54 create: /home/tkloczko/rpmbuild/BUILD/pytest-forked-1.3.0/.tox/py36-pytest54
SKIPPED: InterpreterNotFound: python3.6
py36-pytestlatest create: /home/tkloczko/rpmbuild/BUILD/pytest-forked-1.3.0/.tox/py36-pytestlatest
SKIPPED: InterpreterNotFound: python3.6
py37-pytest310 create: /home/tkloczko/rpmbuild/BUILD/pytest-forked-1.3.0/.tox/py37-pytest310
SKIPPED: InterpreterNotFound: python3.7
py37-pytest46 create: /home/tkloczko/rpmbuild/BUILD/pytest-forked-1.3.0/.tox/py37-pytest46
SKIPPED: InterpreterNotFound: python3.7
py37-pytest54 create: /home/tkloczko/rpmbuild/BUILD/pytest-forked-1.3.0/.tox/py37-pytest54
SKIPPED: InterpreterNotFound: python3.7
py37-pytestlatest create: /home/tkloczko/rpmbuild/BUILD/pytest-forked-1.3.0/.tox/py37-pytestlatest
SKIPPED: InterpreterNotFound: python3.7
py38-pytest310 create: /home/tkloczko/rpmbuild/BUILD/pytest-forked-1.3.0/.tox/py38-pytest310
py38-pytest310 installdeps: pycmd, setuptools_scm, pytest~=3.10
py38-pytest310 inst: /home/tkloczko/rpmbuild/BUILD/pytest-forked-1.3.0/.tox/.tmp/package/1/pytest-forked-1.3.0.tar.gz
py38-pytest310 installed: atomicwrites==1.4.0,attrs==20.3.0,more-itertools==8.7.0,pluggy==0.13.1,py==1.10.0,pycmd==1.2,pytest==3.10.1,pytest-forked==1.3.0,setuptools-scm==6.0.1,six==1.15.0
py38-pytest310 run-test-pre: PYTHONHASHSEED='951081011'
py38-pytest310 run-test: commands[0] | pytest
=========================================================================== test session starts ============================================================================
platform linux -- Python 3.8.9, pytest-3.10.1, py-1.10.0, pluggy-0.13.1
rootdir: /home/tkloczko/rpmbuild/BUILD/pytest-forked-1.3.0, inifile: tox.ini
plugins: forked-1.3.0
collected 10 items

testing/test_boxed.py ...xx.                                                                                                                                         [ 60%]
testing/test_xfail_behavior.py ....                                                                                                                                  [100%]
========================================================================= short test summary info ==========================================================================
XFAIL testing/test_boxed.py::test_functional_boxed_capturing[sys]
  capture cleanup needed
XFAIL testing/test_boxed.py::test_functional_boxed_capturing[fd]
  capture cleanup needed

=================================================================== 8 passed, 2 xfailed in 0.62 seconds ====================================================================
py38-pytest46 create: /home/tkloczko/rpmbuild/BUILD/pytest-forked-1.3.0/.tox/py38-pytest46
py38-pytest46 installdeps: pycmd, setuptools_scm, pytest~=4.6
py38-pytest46 inst: /home/tkloczko/rpmbuild/BUILD/pytest-forked-1.3.0/.tox/.tmp/package/1/pytest-forked-1.3.0.tar.gz
py38-pytest46 installed: atomicwrites==1.4.0,attrs==20.3.0,more-itertools==8.7.0,packaging==20.9,pluggy==0.13.1,py==1.10.0,pycmd==1.2,pyparsing==2.4.7,pytest==4.6.11,pytest-forked==1.3.0,setuptools-scm==6.0.1,six==1.15.0,wcwidth==0.2.5
py38-pytest46 run-test-pre: PYTHONHASHSEED='951081011'
py38-pytest46 run-test: commands[0] | pytest
=========================================================================== test session starts ============================================================================
platform linux -- Python 3.8.9, pytest-4.6.11, py-1.10.0, pluggy-0.13.1
cachedir: .tox/py38-pytest46/.pytest_cache
rootdir: /home/tkloczko/rpmbuild/BUILD/pytest-forked-1.3.0, inifile: tox.ini
plugins: forked-1.3.0
collected 10 items

testing/test_boxed.py ...xx.                                                                                                                                         [ 60%]
testing/test_xfail_behavior.py ....                                                                                                                                  [100%]

========================================================================= short test summary info ==========================================================================
XFAIL testing/test_boxed.py::test_functional_boxed_capturing[sys]
  capture cleanup needed
XFAIL testing/test_boxed.py::test_functional_boxed_capturing[fd]
  capture cleanup needed
=================================================================== 8 passed, 2 xfailed in 0.70 seconds ====================================================================
py38-pytest54 create: /home/tkloczko/rpmbuild/BUILD/pytest-forked-1.3.0/.tox/py38-pytest54
py38-pytest54 installdeps: pycmd, setuptools_scm, pytest~=5.4
py38-pytest54 inst: /home/tkloczko/rpmbuild/BUILD/pytest-forked-1.3.0/.tox/.tmp/package/1/pytest-forked-1.3.0.tar.gz
py38-pytest54 installed: attrs==20.3.0,more-itertools==8.7.0,packaging==20.9,pluggy==0.13.1,py==1.10.0,pycmd==1.2,pyparsing==2.4.7,pytest==5.4.3,pytest-forked==1.3.0,setuptools-scm==6.0.1,wcwidth==0.2.5
py38-pytest54 run-test-pre: PYTHONHASHSEED='951081011'
py38-pytest54 run-test: commands[0] | pytest
=========================================================================== test session starts ============================================================================
platform linux -- Python 3.8.9, pytest-5.4.3, py-1.10.0, pluggy-0.13.1
cachedir: .tox/py38-pytest54/.pytest_cache
rootdir: /home/tkloczko/rpmbuild/BUILD/pytest-forked-1.3.0, inifile: tox.ini
plugins: forked-1.3.0
collected 10 items

testing/test_boxed.py ...xx.                                                                                                                                         [ 60%]
testing/test_xfail_behavior.py ....                                                                                                                                  [100%]

============================================================================= warnings summary =============================================================================
testing/test_boxed.py::test_functional_boxed
testing/test_boxed.py::test_functional_boxed_per_test
testing/test_boxed.py::test_functional_boxed_capturing[no]
testing/test_boxed.py::test_functional_boxed_capturing[sys]
testing/test_boxed.py::test_functional_boxed_capturing[fd]
testing/test_xfail_behavior.py::test_xfail[strict xfail]
testing/test_xfail_behavior.py::test_xfail[strict xpass]
testing/test_xfail_behavior.py::test_xfail[non-strict xfail]
testing/test_xfail_behavior.py::test_xfail[non-strict xpass]
  /home/tkloczko/rpmbuild/BUILD/pytest-forked-1.3.0/.tox/py38-pytest54/lib/python3.8/site-packages/_pytest/compat.py:333: PytestDeprecationWarning: The TerminalReporter.writer attribute is deprecated, use TerminalReporter._tw instead at your own risk.
  See https://docs.pytest.org/en/latest/deprecations.html#terminalreporter-writer for more information.
    return getattr(object, name, default)

-- Docs: https://docs.pytest.org/en/latest/warnings.html
========================================================================= short test summary info ==========================================================================
XFAIL testing/test_boxed.py::test_functional_boxed_capturing[sys]
  capture cleanup needed
XFAIL testing/test_boxed.py::test_functional_boxed_capturing[fd]
  capture cleanup needed
================================================================= 8 passed, 2 xfailed, 9 warnings in 0.76s =================================================================
py38-pytestlatest create: /home/tkloczko/rpmbuild/BUILD/pytest-forked-1.3.0/.tox/py38-pytestlatest
py38-pytestlatest installdeps: pycmd, setuptools_scm, pytest
py38-pytestlatest inst: /home/tkloczko/rpmbuild/BUILD/pytest-forked-1.3.0/.tox/.tmp/package/1/pytest-forked-1.3.0.tar.gz
py38-pytestlatest installed: attrs==20.3.0,iniconfig==1.1.1,packaging==20.9,pluggy==0.13.1,py==1.10.0,pycmd==1.2,pyparsing==2.4.7,pytest==6.2.3,pytest-forked==1.3.0,setuptools-scm==6.0.1,toml==0.10.2
py38-pytestlatest run-test-pre: PYTHONHASHSEED='951081011'
py38-pytestlatest run-test: commands[0] | pytest
=========================================================================== test session starts ============================================================================
platform linux -- Python 3.8.9, pytest-6.2.3, py-1.10.0, pluggy-0.13.1
cachedir: .tox/py38-pytestlatest/.pytest_cache
rootdir: /home/tkloczko/rpmbuild/BUILD/pytest-forked-1.3.0, configfile: tox.ini
plugins: forked-1.3.0
collected 10 items

testing/test_boxed.py ...xx.                                                                                                                                         [ 60%]
testing/test_xfail_behavior.py ....                                                                                                                                  [100%]

========================================================================= short test summary info ==========================================================================
XFAIL testing/test_boxed.py::test_functional_boxed_capturing[sys]
  capture cleanup needed
XFAIL testing/test_boxed.py::test_functional_boxed_capturing[fd]
  capture cleanup needed
======================================================================= 8 passed, 2 xfailed in 0.65s =======================================================================
flakes create: /home/tkloczko/rpmbuild/BUILD/pytest-forked-1.3.0/.tox/flakes
flakes installdeps: flake8
flakes inst: /home/tkloczko/rpmbuild/BUILD/pytest-forked-1.3.0/.tox/.tmp/package/1/pytest-forked-1.3.0.tar.gz
flakes installed: attrs==20.3.0,flake8==3.9.1,iniconfig==1.1.1,mccabe==0.6.1,packaging==20.9,pluggy==0.13.1,py==1.10.0,pycodestyle==2.7.0,pyflakes==2.3.1,pyparsing==2.4.7,pytest==6.2.3,pytest-forked==1.3.0,toml==0.10.2
flakes run-test-pre: PYTHONHASHSEED='951081011'
flakes run-test: commands[0] | flake8 setup.py testing src/pytest_forked/
build-dists create: /home/tkloczko/rpmbuild/BUILD/pytest-forked-1.3.0/.tox/build-dists
build-dists installdeps: pep517 >= 0.7.0
build-dists installed: pep517==0.10.0,toml==0.10.2
build-dists run-test-pre: PYTHONHASHSEED='951081011'
build-dists run-test: commands[0] | rm -rfv /home/tkloczko/rpmbuild/BUILD/pytest-forked-1.3.0/dist/
build-dists run-test: commands[1] | /home/tkloczko/rpmbuild/BUILD/pytest-forked-1.3.0/.tox/build-dists/bin/python -m pep517.build --source --binary --out-dir /home/tkloczko/rpmbuild/BUILD/pytest-forked-1.3.0/dist/ /home/tkloczko/rpmbuild/BUILD/pytest-forked-1.3.0
pep517.build is deprecated. Consider switching to https://pypi.org/project/build/
WARNING: You are using pip version 19.3.1; however, version 21.1.1 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
WARNING: You are using pip version 19.3.1; however, version 21.1.1 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
running sdist
running egg_info
writing src/pytest_forked.egg-info/PKG-INFO
writing dependency_links to src/pytest_forked.egg-info/dependency_links.txt
writing entry points to src/pytest_forked.egg-info/entry_points.txt
writing requirements to src/pytest_forked.egg-info/requires.txt
writing top-level names to src/pytest_forked.egg-info/top_level.txt
adding license file 'LICENSE' (matched pattern 'LICEN[CS]E*')
reading manifest file 'src/pytest_forked.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.txt'
no previously-included directories found matching '.git'
writing manifest file 'src/pytest_forked.egg-info/SOURCES.txt'
running check
creating pytest-forked-1.3.0
creating pytest-forked-1.3.0/example
creating pytest-forked-1.3.0/src
creating pytest-forked-1.3.0/src/pytest_forked
creating pytest-forked-1.3.0/src/pytest_forked.egg-info
creating pytest-forked-1.3.0/testing
creating pytest-forked-1.3.0/testing/__pycache__
copying files to pytest-forked-1.3.0...
copying .gitignore -> pytest-forked-1.3.0
copying .travis.yml -> pytest-forked-1.3.0
copying CHANGELOG -> pytest-forked-1.3.0
copying LICENSE -> pytest-forked-1.3.0
copying MANIFEST.in -> pytest-forked-1.3.0
copying README.rst -> pytest-forked-1.3.0
copying pyproject.toml -> pytest-forked-1.3.0
copying setup.cfg -> pytest-forked-1.3.0
copying setup.py -> pytest-forked-1.3.0
copying tox.ini -> pytest-forked-1.3.0
copying example/boxed.txt -> pytest-forked-1.3.0/example
copying src/pytest_forked/__init__.py -> pytest-forked-1.3.0/src/pytest_forked
copying src/pytest_forked.egg-info/PKG-INFO -> pytest-forked-1.3.0/src/pytest_forked.egg-info
copying src/pytest_forked.egg-info/SOURCES.txt -> pytest-forked-1.3.0/src/pytest_forked.egg-info
copying src/pytest_forked.egg-info/dependency_links.txt -> pytest-forked-1.3.0/src/pytest_forked.egg-info
copying src/pytest_forked.egg-info/entry_points.txt -> pytest-forked-1.3.0/src/pytest_forked.egg-info
copying src/pytest_forked.egg-info/not-zip-safe -> pytest-forked-1.3.0/src/pytest_forked.egg-info
copying src/pytest_forked.egg-info/requires.txt -> pytest-forked-1.3.0/src/pytest_forked.egg-info
copying src/pytest_forked.egg-info/top_level.txt -> pytest-forked-1.3.0/src/pytest_forked.egg-info
copying testing/conftest.py -> pytest-forked-1.3.0/testing
copying testing/test_boxed.py -> pytest-forked-1.3.0/testing
copying testing/test_xfail_behavior.py -> pytest-forked-1.3.0/testing
copying testing/__pycache__/conftest.cpython-38-PYTEST.pyc -> pytest-forked-1.3.0/testing/__pycache__
copying testing/__pycache__/conftest.cpython-38-pytest-5.4.3.pyc -> pytest-forked-1.3.0/testing/__pycache__
copying testing/__pycache__/conftest.cpython-38-pytest-6.2.3.pyc -> pytest-forked-1.3.0/testing/__pycache__
copying testing/__pycache__/test_boxed.cpython-38-PYTEST.pyc -> pytest-forked-1.3.0/testing/__pycache__
copying testing/__pycache__/test_boxed.cpython-38-pytest-5.4.3.pyc -> pytest-forked-1.3.0/testing/__pycache__
copying testing/__pycache__/test_boxed.cpython-38-pytest-6.2.3.pyc -> pytest-forked-1.3.0/testing/__pycache__
copying testing/__pycache__/test_xfail_behavior.cpython-38-PYTEST.pyc -> pytest-forked-1.3.0/testing/__pycache__
copying testing/__pycache__/test_xfail_behavior.cpython-38-pytest-5.4.3.pyc -> pytest-forked-1.3.0/testing/__pycache__
copying testing/__pycache__/test_xfail_behavior.cpython-38-pytest-6.2.3.pyc -> pytest-forked-1.3.0/testing/__pycache__
Writing pytest-forked-1.3.0/setup.cfg
Creating tar archive
removing 'pytest-forked-1.3.0' (and everything under it)
WARNING: You are using pip version 19.3.1; however, version 21.1.1 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
WARNING: You are using pip version 19.3.1; however, version 21.1.1 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
running bdist_wheel
running build
running build_py
installing to build/bdist.linux-x86_64/wheel
running install
running install_lib
creating build/bdist.linux-x86_64
creating build/bdist.linux-x86_64/wheel
creating build/bdist.linux-x86_64/wheel/pytest_forked
copying build/lib/pytest_forked/__init__.py -> build/bdist.linux-x86_64/wheel/pytest_forked
running install_egg_info
running egg_info
writing src/pytest_forked.egg-info/PKG-INFO
writing dependency_links to src/pytest_forked.egg-info/dependency_links.txt
writing entry points to src/pytest_forked.egg-info/entry_points.txt
writing requirements to src/pytest_forked.egg-info/requires.txt
writing top-level names to src/pytest_forked.egg-info/top_level.txt
adding license file 'LICENSE' (matched pattern 'LICEN[CS]E*')
reading manifest file 'src/pytest_forked.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.txt'
no previously-included directories found matching '.git'
writing manifest file 'src/pytest_forked.egg-info/SOURCES.txt'
Copying src/pytest_forked.egg-info to build/bdist.linux-x86_64/wheel/pytest_forked-1.3.0-py3.8.egg-info
running install_scripts
adding license file "LICENSE" (matched pattern "LICEN[CS]E*")
creating build/bdist.linux-x86_64/wheel/pytest_forked-1.3.0.dist-info/WHEEL
creating '/tmp/tmp_sp8uefq/tmpj3lnod8r/pytest_forked-1.3.0-py2.py3-none-any.whl' and adding 'build/bdist.linux-x86_64/wheel' to it
adding 'pytest_forked/__init__.py'
adding 'pytest_forked-1.3.0.dist-info/LICENSE'
adding 'pytest_forked-1.3.0.dist-info/METADATA'
adding 'pytest_forked-1.3.0.dist-info/WHEEL'
adding 'pytest_forked-1.3.0.dist-info/entry_points.txt'
adding 'pytest_forked-1.3.0.dist-info/top_level.txt'
adding 'pytest_forked-1.3.0.dist-info/RECORD'
removing build/bdist.linux-x86_64/wheel
metadata-validation create: /home/tkloczko/rpmbuild/BUILD/pytest-forked-1.3.0/.tox/metadata-validation
metadata-validation installdeps: twine
metadata-validation installed: bleach==3.3.0,certifi==2020.12.5,cffi==1.14.5,chardet==4.0.0,colorama==0.4.4,cryptography==3.4.7,docutils==0.17.1,idna==2.10,importlib-metadata==4.0.1,jeepney==0.6.0,keyring==23.0.1,packaging==20.9,pkginfo==1.7.0,pycparser==2.20,Pygments==2.9.0,pyparsing==2.4.7,readme-renderer==29.0,requests==2.25.1,requests-toolbelt==0.9.1,rfc3986==1.4.0,SecretStorage==3.3.1,six==1.15.0,tqdm==4.60.0,twine==3.4.1,urllib3==1.26.4,webencodings==0.5.1,zipp==3.4.1
metadata-validation run-test-pre: PYTHONHASHSEED='951081011'
metadata-validation run-test: commands[0] | twine check '/home/tkloczko/rpmbuild/BUILD/pytest-forked-1.3.0/dist/*'
Checking /home/tkloczko/rpmbuild/BUILD/pytest-forked-1.3.0/dist/pytest_forked-1.3.0-py2.py3-none-any.whl: PASSED
Checking /home/tkloczko/rpmbuild/BUILD/pytest-forked-1.3.0/dist/pytest-forked-1.3.0.tar.gz: PASSED
_________________________________________________________________________________ summary __________________________________________________________________________________
SKIPPED:  py27-pytest310: InterpreterNotFound: python2.7
SKIPPED:  py27-pytest46: InterpreterNotFound: python2.7
SKIPPED:  py27-pytest54: InterpreterNotFound: python2.7
SKIPPED:  py27-pytestlatest: InterpreterNotFound: python2.7
SKIPPED:  py35-pytest310: InterpreterNotFound: python3.5
SKIPPED:  py35-pytest46: InterpreterNotFound: python3.5
SKIPPED:  py35-pytest54: InterpreterNotFound: python3.5
SKIPPED:  py35-pytestlatest: InterpreterNotFound: python3.5
SKIPPED:  py36-pytest310: InterpreterNotFound: python3.6
SKIPPED:  py36-pytest46: InterpreterNotFound: python3.6
SKIPPED:  py36-pytest54: InterpreterNotFound: python3.6
SKIPPED:  py36-pytestlatest: InterpreterNotFound: python3.6
SKIPPED:  py37-pytest310: InterpreterNotFound: python3.7
SKIPPED:  py37-pytest46: InterpreterNotFound: python3.7
SKIPPED:  py37-pytest54: InterpreterNotFound: python3.7
SKIPPED:  py37-pytestlatest: InterpreterNotFound: python3.7
  py38-pytest310: commands succeeded
  py38-pytest46: commands succeeded
  py38-pytest54: commands succeeded
  py38-pytestlatest: commands succeeded
  flakes: commands succeeded
  build-dists: commands succeeded
  metadata-validation: commands succeeded
  congratulations :)

Usually, when building my own rpm package and I see that tox is used as the primary testing framework, I try to use pytest directly because it has somewhat shorter dependencies than tox.
Usually I'm doing such a test in an env where I have installed Everything™️😎
I found that pytest is failing:

+ PYTHONPATH=/home/tkloczko/rpmbuild/BUILDROOT/python-pytest-forked-1.3.0-3.fc35.x86_64/usr/lib64/python3.8/site-packages:/home/tkloczko/rpmbuild/BUILDROOT/python-pytest-forked-1.3.0-3.fc35.x86_64/usr/lib/python3.8/site-packages
+ /usr/bin/python3 -Bm pytest -ra
=========================================================================== test session starts ============================================================================
platform linux -- Python 3.8.9, pytest-6.2.3, py-1.10.0, pluggy-0.13.1
rootdir: /home/tkloczko/rpmbuild/BUILD/pytest-forked-1.3.0, configfile: tox.ini
plugins: forked-1.3.0, shutil-1.7.0, virtualenv-1.7.0, asyncio-0.14.0, expect-1.1.0, cov-2.11.1, mock-3.5.1, httpbin-1.0.0, xdist-2.2.1, flake8-1.0.7, timeout-1.4.2, betamax-0.8.1, pyfakefs-4.4.0, freezegun-0.4.2, flaky-3.7.0, cases-3.4.6, hypothesis-6.10.1, case-1.5.3, isort-1.3.0, aspectlib-1.5.2
collected 10 items

testing/test_boxed.py ..Fxx.                                                                                                                                         [ 60%]
testing/test_xfail_behavior.py .F.F                                                                                                                                  [100%]

================================================================================= FAILURES =================================================================================
___________________________________________________________________ test_functional_boxed_capturing[no] ____________________________________________________________________

testdir = <Testdir local('/tmp/pytest-of-tkloczko/pytest-10/test_functional_boxed_capturing0')>, capmode = 'no'

    @needsfork
    @pytest.mark.parametrize("capmode", [
        "no",
        pytest.param("sys", marks=pytest.mark.xfail(reason="capture cleanup needed")),
        pytest.param("fd", marks=pytest.mark.xfail(reason="capture cleanup needed"))])
    def test_functional_boxed_capturing(testdir, capmode):
        p1 = testdir.makepyfile("""
            import os
            import sys
            def test_function():
                sys.stdout.write("hello\\n")
                sys.stderr.write("world\\n")
                os.kill(os.getpid(), 15)
        """)
        result = testdir.runpytest(p1, "--forked", "--capture=%s" % capmode)
>       result.stdout.fnmatch_lines("""
            *CRASHED*
            *stdout*
            hello
            *stderr*
            world
            *1 failed*
    """)
E       Failed: nomatch: '*CRASHED*'
E           and: '============================= test session starts =============================='
E           and: 'platform linux -- Python 3.8.9, pytest-6.2.3, py-1.10.0, pluggy-0.13.1'
E           and: 'rootdir: /tmp/pytest-of-tkloczko/pytest-10/test_functional_boxed_capturing0'
E           and: 'plugins: forked-1.3.0, shutil-1.7.0, virtualenv-1.7.0, asyncio-0.14.0, expect-1.1.0, cov-2.11.1, mock-3.5.1, httpbin-1.0.0, xdist-2.2.1, flake8-1.0.7, timeout-1.4.2, betamax-0.8.1, pyfakefs-4.4.0, freezegun-0.4.2, flaky-3.7.0, cases-3.4.6, hypothesis-6.10.1, case-1.5.3, isort-1.3.0, aspectlib-1.5.2'
E           and: 'collected 1 item'
E           and: ''
E           and: 'test_functional_boxed_capturing.py F'
E           and: ''
E           and: '=================================== FAILURES ==================================='
E           and: '________________________________ test_function _________________________________'
E       fnmatch: '*CRASHED*'
E          with: ':-1: running the test CRASHED with signal 0'
E       nomatch: '*stdout*'
E           and: '------------------------------- captured stderr --------------------------------'
E           and: '/usr/lib/python3.8/site-packages/flaky/flaky_pytest_plugin.py:139: KeyError: <Function test_function>'
E           and: '=========================== short test summary info ============================'
E           and: 'FAILED test_functional_boxed_capturing.py::test_function'
E           and: '============================== 1 failed in 0.07s ==============================='
E       remains unmatched: '*stdout*'

/home/tkloczko/rpmbuild/BUILD/pytest-forked-1.3.0/testing/test_boxed.py:54: Failed
--------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.9, pytest-6.2.3, py-1.10.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-tkloczko/pytest-10/test_functional_boxed_capturing0
plugins: forked-1.3.0, shutil-1.7.0, virtualenv-1.7.0, asyncio-0.14.0, expect-1.1.0, cov-2.11.1, mock-3.5.1, httpbin-1.0.0, xdist-2.2.1, flake8-1.0.7, timeout-1.4.2, betamax-0.8.1, pyfakefs-4.4.0, freezegun-0.4.2, flaky-3.7.0, cases-3.4.6, hypothesis-6.10.1, case-1.5.3, isort-1.3.0, aspectlib-1.5.2
collected 1 item

test_functional_boxed_capturing.py F

=================================== FAILURES ===================================
________________________________ test_function _________________________________
:-1: running the test CRASHED with signal 0
------------------------------- captured stderr --------------------------------
/usr/lib/python3.8/site-packages/flaky/flaky_pytest_plugin.py:139: KeyError: <Function test_function>
=========================== short test summary info ============================
FAILED test_functional_boxed_capturing.py::test_function
============================== 1 failed in 0.07s ===============================
_________________________________________________________________________ test_xfail[strict xpass] _________________________________________________________________________

is_crashing = False, is_strict = True, testdir = <Testdir local('/tmp/pytest-of-tkloczko/pytest-10/test_xfail1')>

    @pytest.mark.parametrize(
        ('is_crashing', 'is_strict'),
        (
            pytest.param(True, True, id='strict xfail'),
            pytest.param(False, True, id='strict xpass'),
            pytest.param(True, False, id='non-strict xfail'),
            pytest.param(False, False, id='non-strict xpass'),
        ),
    )
    def test_xfail(is_crashing, is_strict, testdir):
        """Test xfail/xpass/strict permutations."""
        # pylint: disable=possibly-unused-variable
        sig_num = signal.SIGTERM.numerator

        test_func_body = (
            'os.kill(os.getpid(), signal.SIGTERM)'
            if is_crashing
            else 'assert True'
        )

        if is_crashing:
            # marked xfailed and crashing, no matter strict or not
            expected_letter = 'x'  # XFAILED
            expected_lowercase = 'xfailed'
            expected_word = 'XFAIL'
        elif is_strict:
            # strict and not failing as expected should cause failure
            expected_letter = 'F'  # FAILED
            expected_lowercase = 'failed'
            expected_word = FAILED_WORD
        elif not is_strict:
            # non-strict and not failing as expected should cause xpass
            expected_letter = 'X'  # XPASS
            expected_lowercase = 'xpassed'
            expected_word = 'XPASS'

        session_start_title = '*==== test session starts ====*'
        loaded_pytest_plugins = 'plugins: forked*'
        collected_tests_num = 'collected 1 item'
        expected_progress = 'test_xfail.py {expected_letter!s}*'.format(**locals())
        failures_title = '*==== FAILURES ====*'
        failures_test_name = '*____ test_function ____*'
        failures_test_reason = '[XPASS(strict)] The process gets terminated'
        short_test_summary_title = '*==== short test summary info ====*'
        short_test_summary = (
            '{expected_word!s} test_xfail.py::test_function'.
            format(**locals())
        )
        if expected_lowercase == 'xpassed':
            # XPASS wouldn't have the crash message from
            # pytest-forked because the crash doesn't happen
            short_test_summary = ' '.join((
                short_test_summary, 'The process gets terminated',
            ))
        reason_string = (
            '  reason: The process gets terminated; '
            'pytest-forked reason: '
            '*:*: running the test CRASHED with signal {sig_num:d}'.
            format(**locals())
        )
        total_summary_line = (
            '*==== 1 {expected_lowercase!s} in 0.*s* ====*'.
            format(**locals())
        )

        expected_lines = (
            session_start_title,
            loaded_pytest_plugins,
            collected_tests_num,
            expected_progress,
        )
        if expected_word == FAILED_WORD:
            # XPASS(strict)
            expected_lines += (
                failures_title,
                failures_test_name,
                failures_test_reason,
            )
        expected_lines += (
            short_test_summary_title,
            short_test_summary,
        )
        if expected_lowercase == 'xpassed' and expected_word == FAILED_WORD:
            # XPASS(strict)
            expected_lines += (
                reason_string,
            )
        expected_lines += (
            total_summary_line,
        )

        test_module = testdir.makepyfile(
            """
            import os
            import signal

            import pytest

            # The current implementation emits RuntimeWarning.
            pytestmark = pytest.mark.filterwarnings('ignore:pytest-forked xfail')

            @pytest.mark.xfail(
                reason='The process gets terminated',
                strict={is_strict!s},
            )
            @pytest.mark.forked
            def test_function():
                {test_func_body!s}
            """.
            format(**locals())
        )

        pytest_run_result = testdir.runpytest(test_module, '-ra')
>       pytest_run_result.stdout.fnmatch_lines(expected_lines)
E       Failed: fnmatch: '*==== test session starts ====*'
E          with: '============================= test session starts =============================='
E       nomatch: 'plugins: forked*'
E           and: 'platform linux -- Python 3.8.9, pytest-6.2.3, py-1.10.0, pluggy-0.13.1'
E           and: 'rootdir: /tmp/pytest-of-tkloczko/pytest-10/test_xfail1'
E       fnmatch: 'plugins: forked*'
E          with: 'plugins: forked-1.3.0, shutil-1.7.0, virtualenv-1.7.0, asyncio-0.14.0, expect-1.1.0, cov-2.11.1, mock-3.5.1, httpbin-1.0.0, xdist-2.2.1, flake8-1.0.7, timeout-1.4.2, betamax-0.8.1, pyfakefs-4.4.0, freezegun-0.4.2, flaky-3.7.0, cases-3.4.6, hypothesis-6.10.1, case-1.5.3, isort-1.3.0, aspectlib-1.5.2'
E       exact match: 'collected 1 item'
E       nomatch: 'test_xfail.py F*'
E           and: ''
E           and: 'test_xfail.py x                                                          [100%]'
E           and: ''
E           and: '=========================== short test summary info ============================'
E           and: 'XFAIL test_xfail.py::test_function'
E           and: '  reason: The process gets terminated; pytest-forked reason: :-1: running the test CRASHED with signal 0'
E           and: '============================== 1 xfailed in 0.06s =============================='
E       remains unmatched: 'test_xfail.py F*'

/home/tkloczko/rpmbuild/BUILD/pytest-forked-1.3.0/testing/test_xfail_behavior.py:130: Failed
--------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.9, pytest-6.2.3, py-1.10.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-tkloczko/pytest-10/test_xfail1
plugins: forked-1.3.0, shutil-1.7.0, virtualenv-1.7.0, asyncio-0.14.0, expect-1.1.0, cov-2.11.1, mock-3.5.1, httpbin-1.0.0, xdist-2.2.1, flake8-1.0.7, timeout-1.4.2, betamax-0.8.1, pyfakefs-4.4.0, freezegun-0.4.2, flaky-3.7.0, cases-3.4.6, hypothesis-6.10.1, case-1.5.3, isort-1.3.0, aspectlib-1.5.2
collected 1 item

test_xfail.py x                                                          [100%]

=========================== short test summary info ============================
XFAIL test_xfail.py::test_function
  reason: The process gets terminated; pytest-forked reason: :-1: running the test CRASHED with signal 0
============================== 1 xfailed in 0.06s ==============================
_______________________________________________________________________ test_xfail[non-strict xpass] _______________________________________________________________________

is_crashing = False, is_strict = False, testdir = <Testdir local('/tmp/pytest-of-tkloczko/pytest-10/test_xfail3')>

    @pytest.mark.parametrize(
        ('is_crashing', 'is_strict'),
        (
            pytest.param(True, True, id='strict xfail'),
            pytest.param(False, True, id='strict xpass'),
            pytest.param(True, False, id='non-strict xfail'),
            pytest.param(False, False, id='non-strict xpass'),
        ),
    )
    def test_xfail(is_crashing, is_strict, testdir):
        """Test xfail/xpass/strict permutations."""
        # pylint: disable=possibly-unused-variable
        sig_num = signal.SIGTERM.numerator

        test_func_body = (
            'os.kill(os.getpid(), signal.SIGTERM)'
            if is_crashing
            else 'assert True'
        )

        if is_crashing:
            # marked xfailed and crashing, no matter strict or not
            expected_letter = 'x'  # XFAILED
            expected_lowercase = 'xfailed'
            expected_word = 'XFAIL'
        elif is_strict:
            # strict and not failing as expected should cause failure
            expected_letter = 'F'  # FAILED
            expected_lowercase = 'failed'
            expected_word = FAILED_WORD
        elif not is_strict:
            # non-strict and not failing as expected should cause xpass
            expected_letter = 'X'  # XPASS
            expected_lowercase = 'xpassed'
            expected_word = 'XPASS'

        session_start_title = '*==== test session starts ====*'
        loaded_pytest_plugins = 'plugins: forked*'
        collected_tests_num = 'collected 1 item'
        expected_progress = 'test_xfail.py {expected_letter!s}*'.format(**locals())
        failures_title = '*==== FAILURES ====*'
        failures_test_name = '*____ test_function ____*'
        failures_test_reason = '[XPASS(strict)] The process gets terminated'
        short_test_summary_title = '*==== short test summary info ====*'
        short_test_summary = (
            '{expected_word!s} test_xfail.py::test_function'.
            format(**locals())
        )
        if expected_lowercase == 'xpassed':
            # XPASS wouldn't have the crash message from
            # pytest-forked because the crash doesn't happen
            short_test_summary = ' '.join((
                short_test_summary, 'The process gets terminated',
            ))
        reason_string = (
            '  reason: The process gets terminated; '
            'pytest-forked reason: '
            '*:*: running the test CRASHED with signal {sig_num:d}'.
            format(**locals())
        )
        total_summary_line = (
            '*==== 1 {expected_lowercase!s} in 0.*s* ====*'.
            format(**locals())
        )

        expected_lines = (
            session_start_title,
            loaded_pytest_plugins,
            collected_tests_num,
            expected_progress,
        )
        if expected_word == FAILED_WORD:
            # XPASS(strict)
            expected_lines += (
                failures_title,
                failures_test_name,
                failures_test_reason,
            )
        expected_lines += (
            short_test_summary_title,
            short_test_summary,
        )
        if expected_lowercase == 'xpassed' and expected_word == FAILED_WORD:
            # XPASS(strict)
            expected_lines += (
                reason_string,
            )
        expected_lines += (
            total_summary_line,
        )

        test_module = testdir.makepyfile(
            """
            import os
            import signal

            import pytest

            # The current implementation emits RuntimeWarning.
            pytestmark = pytest.mark.filterwarnings('ignore:pytest-forked xfail')

            @pytest.mark.xfail(
                reason='The process gets terminated',
                strict={is_strict!s},
            )
            @pytest.mark.forked
            def test_function():
                {test_func_body!s}
            """.
            format(**locals())
        )

        pytest_run_result = testdir.runpytest(test_module, '-ra')
>       pytest_run_result.stdout.fnmatch_lines(expected_lines)
E       Failed: fnmatch: '*==== test session starts ====*'
E          with: '============================= test session starts =============================='
E       nomatch: 'plugins: forked*'
E           and: 'platform linux -- Python 3.8.9, pytest-6.2.3, py-1.10.0, pluggy-0.13.1'
E           and: 'rootdir: /tmp/pytest-of-tkloczko/pytest-10/test_xfail3'
E       fnmatch: 'plugins: forked*'
E          with: 'plugins: forked-1.3.0, shutil-1.7.0, virtualenv-1.7.0, asyncio-0.14.0, expect-1.1.0, cov-2.11.1, mock-3.5.1, httpbin-1.0.0, xdist-2.2.1, flake8-1.0.7, timeout-1.4.2, betamax-0.8.1, pyfakefs-4.4.0, freezegun-0.4.2, flaky-3.7.0, cases-3.4.6, hypothesis-6.10.1, case-1.5.3, isort-1.3.0, aspectlib-1.5.2'
E       exact match: 'collected 1 item'
E       nomatch: 'test_xfail.py X*'
E           and: ''
E           and: 'test_xfail.py x                                                          [100%]'
E           and: ''
E           and: '=========================== short test summary info ============================'
E           and: 'XFAIL test_xfail.py::test_function'
E           and: '  reason: The process gets terminated; pytest-forked reason: :-1: running the test CRASHED with signal 0'
E           and: '============================== 1 xfailed in 0.06s =============================='
E       remains unmatched: 'test_xfail.py X*'

/home/tkloczko/rpmbuild/BUILD/pytest-forked-1.3.0/testing/test_xfail_behavior.py:130: Failed
--------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.9, pytest-6.2.3, py-1.10.0, pluggy-0.13.1
rootdir: /tmp/pytest-of-tkloczko/pytest-10/test_xfail3
plugins: forked-1.3.0, shutil-1.7.0, virtualenv-1.7.0, asyncio-0.14.0, expect-1.1.0, cov-2.11.1, mock-3.5.1, httpbin-1.0.0, xdist-2.2.1, flake8-1.0.7, timeout-1.4.2, betamax-0.8.1, pyfakefs-4.4.0, freezegun-0.4.2, flaky-3.7.0, cases-3.4.6, hypothesis-6.10.1, case-1.5.3, isort-1.3.0, aspectlib-1.5.2
collected 1 item

test_xfail.py x                                                          [100%]

=========================== short test summary info ============================
XFAIL test_xfail.py::test_function
  reason: The process gets terminated; pytest-forked reason: :-1: running the test CRASHED with signal 0
============================== 1 xfailed in 0.06s ==============================
========================================================================= short test summary info ==========================================================================
XFAIL testing/test_boxed.py::test_functional_boxed_capturing[sys]
  capture cleanup needed
XFAIL testing/test_boxed.py::test_functional_boxed_capturing[fd]
  capture cleanup needed
FAILED testing/test_boxed.py::test_functional_boxed_capturing[no] - Failed: nomatch: '*CRASHED*'
FAILED testing/test_xfail_behavior.py::test_xfail[strict xpass] - Failed: fnmatch: '*==== test session starts ====*'
FAILED testing/test_xfail_behavior.py::test_xfail[non-strict xpass] - Failed: fnmatch: '*==== test session starts ====*'
================================================================== 3 failed, 5 passed, 2 xfailed in 3.04s ==================================================================

Mac: Bad file descriptor

Not sure why this has suddenly changed, but tests are now erroring on macOS runners. The worst part is that the process doesn't end after this error, so the CI doesn't fail until it times out after 15 minutes.

See test results here (I've also verified on a separate PR, so it's not related to the coverage version bump):
https://github.com/aio-libs/aiohttp-devtools/actions/runs/3783707241/jobs/6439936092

INTERNALERROR> Traceback (most recent call last):
INTERNALERROR>   File "/Users/runner/hostedtoolcache/Python/3.10.9/x64/lib/python3.10/site-packages/_pytest/main.py", line 269, in wrap_session
INTERNALERROR>     session.exitstatus = doit(config, session) or 0
INTERNALERROR>   File "/Users/runner/hostedtoolcache/Python/3.10.9/x64/lib/python3.10/site-packages/_pytest/main.py", line 323, in _main
INTERNALERROR>     config.hook.pytest_runtestloop(session=session)
INTERNALERROR>   File "/Users/runner/hostedtoolcache/Python/3.10.9/x64/lib/python3.10/site-packages/pluggy/_hooks.py", line 265, in __call__
INTERNALERROR>     return self._hookexec(self.name, self.get_hookimpls(), kwargs, firstresult)
INTERNALERROR>   File "/Users/runner/hostedtoolcache/Python/3.10.9/x64/lib/python3.10/site-packages/pluggy/_manager.py", line 80, in _hookexec
INTERNALERROR>     return self._inner_hookexec(hook_name, methods, kwargs, firstresult)
INTERNALERROR>   File "/Users/runner/hostedtoolcache/Python/3.10.9/x64/lib/python3.10/site-packages/pluggy/_callers.py", line 60, in _multicall
INTERNALERROR>     return outcome.get_result()
INTERNALERROR>   File "/Users/runner/hostedtoolcache/Python/3.10.9/x64/lib/python3.10/site-packages/pluggy/_result.py", line 60, in get_result
INTERNALERROR>     raise ex[1].with_traceback(ex[2])
INTERNALERROR>   File "/Users/runner/hostedtoolcache/Python/3.10.9/x64/lib/python3.10/site-packages/pluggy/_callers.py", line 39, in _multicall
INTERNALERROR>     res = hook_impl.function(*args)
INTERNALERROR>   File "/Users/runner/hostedtoolcache/Python/3.10.9/x64/lib/python3.10/site-packages/_pytest/main.py", line 348, in pytest_runtestloop
INTERNALERROR>     item.config.hook.pytest_runtest_protocol(item=item, nextitem=nextitem)
INTERNALERROR>   File "/Users/runner/hostedtoolcache/Python/3.10.9/x64/lib/python3.10/site-packages/pluggy/_hooks.py", line 265, in __call__
INTERNALERROR>     return self._hookexec(self.name, self.get_hookimpls(), kwargs, firstresult)
INTERNALERROR>   File "/Users/runner/hostedtoolcache/Python/3.10.9/x64/lib/python3.10/site-packages/pluggy/_manager.py", line 80, in _hookexec
INTERNALERROR>     return self._inner_hookexec(hook_name, methods, kwargs, firstresult)
INTERNALERROR>   File "/Users/runner/hostedtoolcache/Python/3.10.9/x64/lib/python3.10/site-packages/pluggy/_callers.py", line 60, in _multicall
INTERNALERROR>     return outcome.get_result()
INTERNALERROR>   File "/Users/runner/hostedtoolcache/Python/3.10.9/x64/lib/python3.10/site-packages/pluggy/_result.py", line 60, in get_result
INTERNALERROR>     raise ex[1].with_traceback(ex[2])
INTERNALERROR>   File "/Users/runner/hostedtoolcache/Python/3.10.9/x64/lib/python3.10/site-packages/pluggy/_callers.py", line 39, in _multicall
INTERNALERROR>     res = hook_impl.function(*args)
INTERNALERROR>   File "/Users/runner/hostedtoolcache/Python/3.10.9/x64/lib/python3.10/site-packages/pytest_forked/__init__.py", line 51, in pytest_runtest_protocol
INTERNALERROR>     reports = forked_run_report(item)
INTERNALERROR>   File "/Users/runner/hostedtoolcache/Python/3.10.9/x64/lib/python3.10/site-packages/pytest_forked/__init__.py", line 73, in forked_run_report
INTERNALERROR>     ff = py.process.ForkedFunc(runforked)
INTERNALERROR>   File "/Users/runner/hostedtoolcache/Python/3.10.9/x64/lib/python3.10/site-packages/py/_process/forkedfunc.py", line 50, in __init__
INTERNALERROR>     self._child(nice_level, child_on_start, child_on_exit)
INTERNALERROR>   File "/Users/runner/hostedtoolcache/Python/3.10.9/x64/lib/python3.10/site-packages/py/_process/forkedfunc.py", line 74, in _child
INTERNALERROR>     stdout.close()
INTERNALERROR> OSError: [Errno 9] Bad file descriptor

Testing support for OpenIndiana (sunos5)

The tox.ini limits testing support to linux and darwin only, but there are some other non-Windows platforms that are able to run the tests properly. I just confirmed that on OpenIndiana (the sunos5 platform) all tests pass (once the flaky plugin is disabled - see #52), so please add sunos5 to the list of supported platforms in tox.ini. Thank you.

Alternative to fork on Windows

Hi.

I understand that the forking functionality is not supported on Windows.
Is there, however, any other package capable of running the different test functions in different processes?

My code is something like:

# test_scripts.py
import runpy

import pytest

scripts = [ ... ]   # list of file paths

@pytest.mark.parametrize('script', scripts)
def test_script_execution(script):
    runpy.run_path(script)

The scripts don't need any variables or state from this process (so I don't exactly need to fork), but they do need to be run in different processes, since they use packages containing global variables.

Is there any way to use pytest for this on Windows?
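
For illustration only, here is a minimal sketch of one fork-free approach, assuming each script can simply be executed with a fresh interpreter via the standard subprocess module (the file name and script paths below are placeholders, not from the original report):

# test_scripts_subprocess.py - hypothetical sketch, not part of pytest-forked
import subprocess
import sys

import pytest

scripts = ["scripts/first.py", "scripts/second.py"]  # placeholder file paths

@pytest.mark.parametrize('script', scripts)
def test_script_execution(script):
    # Launch each script in a fresh interpreter instead of forking,
    # so the approach also works on Windows.
    result = subprocess.run([sys.executable, script], capture_output=True, text=True)
    assert result.returncode == 0, result.stderr

Each test then gets a clean process and clean module-level state, at the cost of one interpreter start-up per script.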

crashing test hangs `pytest`

I'm observing the pytest process itself seemingly breaking as a result of a segmentation fault observed in a test.
Here is the relevant part of the log:

../python/tests/ctk/simulator/test_simple.py::test_battery Fatal Python error: Segmentation fault

Current thread 0x00007f412136e700 (most recent call first):
<no Python frame>

Thread 0x00007f4181d75780 (most recent call first):
  File "/home/user/.local/lib/python3.8/site-packages/py/_process/forkedfunc.py", line 76 in _child
  File "/home/user/.local/lib/python3.8/site-packages/py/_process/forkedfunc.py", line 50 in __init__
  File "/home/user/.local/lib/python3.8/site-packages/pytest_forked/__init__.py", line 67 in forked_run_report
  File "/home/user/.local/lib/python3.8/site-packages/pytest_forked/__init__.py", line 46 in pytest_runtest_protocol
  File "/home/user/.local/lib/python3.8/site-packages/pluggy/_callers.py", line 39 in _multicall
  File "/home/user/.local/lib/python3.8/site-packages/pluggy/_manager.py", line 80 in _hookexec
  File "/home/user/.local/lib/python3.8/site-packages/pluggy/_hooks.py", line 265 in __call__
  File "/home/user/.local/lib/python3.8/site-packages/_pytest/main.py", line 348 in pytest_runtestloop
  File "/home/user/.local/lib/python3.8/site-packages/pluggy/_callers.py", line 39 in _multicall
  File "/home/user/.local/lib/python3.8/site-packages/pluggy/_manager.py", line 80 in _hookexec
  File "/home/user/.local/lib/python3.8/site-packages/pluggy/_hooks.py", line 265 in __call__
  File "/home/user/.local/lib/python3.8/site-packages/_pytest/main.py", line 323 in _main
  File "/home/user/.local/lib/python3.8/site-packages/_pytest/main.py", line 269 in wrap_session
  File "/home/user/.local/lib/python3.8/site-packages/_pytest/main.py", line 316 in pytest_cmdline_main
  File "/home/user/.local/lib/python3.8/site-packages/pluggy/_callers.py", line 39 in _multicall
  File "/home/user/.local/lib/python3.8/site-packages/pluggy/_manager.py", line 80 in _hookexec
  File "/home/user/.local/lib/python3.8/site-packages/pluggy/_hooks.py", line 265 in __call__
  File "/home/user/.local/lib/python3.8/site-packages/_pytest/config/__init__.py", line 162 in main
  File "/home/user/.local/lib/python3.8/site-packages/_pytest/config/__init__.py", line 185 in console_main
  File "/home/user/.local/bin/py.test", line 8 in <module>
Fatal Python error: Aborted

Current thread 0x00007f412136e700 (most recent call first):
<no Python frame>

Thread 0x00007f4181d75780 (most recent call first):
  File "/home/user/.local/lib/python3.8/site-packages/py/_process/forkedfunc.py", line 76 in _child
  File "/home/user/.local/lib/python3.8/site-packages/py/_process/forkedfunc.py", line 50 in __init__
  File "/home/user/.local/lib/python3.8/site-packages/pytest_forked/__init__.py", line 67 in forked_run_report
  File "/home/user/.local/lib/python3.8/site-packages/pytest_forked/__init__.py", line 46 in pytest_runtest_protocol
  File "/home/user/.local/lib/python3.8/site-packages/pluggy/_callers.py", line 39 in _multicall
  File "/home/user/.local/lib/python3.8/site-packages/pluggy/_manager.py", line 80 in _hookexec
  File "/home/user/.local/lib/python3.8/site-packages/pluggy/_hooks.py", line 265 in __call__
  File "/home/user/.local/lib/python3.8/site-packages/_pytest/main.py", line 348 in pytest_runtestloop
  File "/home/user/.local/lib/python3.8/site-packages/pluggy/_callers.py", line 39 in _multicall
  File "/home/user/.local/lib/python3.8/site-packages/pluggy/_manager.py", line 80 in _hookexec
  File "/home/user/.local/lib/python3.8/site-packages/pluggy/_hooks.py", line 265 in __call__
  File "/home/user/.local/lib/python3.8/site-packages/_pytest/main.py", line 323 in _main
  File "/home/user/.local/lib/python3.8/site-packages/_pytest/main.py", line 269 in wrap_session
  File "/home/user/.local/lib/python3.8/site-packages/_pytest/main.py", line 316 in pytest_cmdline_main
  File "/home/user/.local/lib/python3.8/site-packages/pluggy/_callers.py", line 39 in _multicall
  File "/home/user/.local/lib/python3.8/site-packages/pluggy/_manager.py", line 80 in _hookexec
  File "/home/user/.local/lib/python3.8/site-packages/pluggy/_hooks.py", line 265 in __call__
  File "/home/user/.local/lib/python3.8/site-packages/_pytest/config/__init__.py", line 162 in main
  File "/home/user/.local/lib/python3.8/site-packages/_pytest/config/__init__.py", line 185 in console_main
  File "/home/user/.local/bin/py.test", line 8 in <module>

Any idea how this might happen? Is this a pytest bug?

Integration with pytest-timeout plugin

Hi, when trying to add pytest-timeout to troubleshoot a hanging test, I get the following output:

Traceback (most recent call last):
  File "/opt/conda/lib/python3.6/site-packages/_pytest/main.py", line 203, in wrap_session
    session.exitstatus = doit(config, session) or 0
  File "/opt/conda/lib/python3.6/site-packages/_pytest/main.py", line 243, in _main
    config.hook.pytest_runtestloop(session=session)
  File "/opt/conda/lib/python3.6/site-packages/pluggy/hooks.py", line 289, in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
  File "/opt/conda/lib/python3.6/site-packages/pluggy/manager.py", line 87, in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
  File "/opt/conda/lib/python3.6/site-packages/pluggy/manager.py", line 81, in <lambda>
    firstresult=hook.spec.opts.get("firstresult") if hook.spec else False,
  File "/opt/conda/lib/python3.6/site-packages/pluggy/callers.py", line 203, in _multicall
    gen.send(outcome)
  File "/opt/conda/lib/python3.6/site-packages/pluggy/callers.py", line 80, in get_result
    raise ex[1].with_traceback(ex[2])
  File "/opt/conda/lib/python3.6/site-packages/pluggy/callers.py", line 187, in _multicall
    res = hook_impl.function(*args)
  File "/opt/conda/lib/python3.6/site-packages/_pytest/main.py", line 264, in pytest_runtestloop
    item.config.hook.pytest_runtest_protocol(item=item, nextitem=nextitem)
  File "/opt/conda/lib/python3.6/site-packages/pluggy/hooks.py", line 289, in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
  File "/opt/conda/lib/python3.6/site-packages/pluggy/manager.py", line 87, in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
  File "/opt/conda/lib/python3.6/site-packages/pluggy/manager.py", line 81, in <lambda>
    firstresult=hook.spec.opts.get("firstresult") if hook.spec else False,
  File "/opt/conda/lib/python3.6/site-packages/pluggy/callers.py", line 208, in _multicall
    return outcome.get_result()
  File "/opt/conda/lib/python3.6/site-packages/pluggy/callers.py", line 80, in get_result
    raise ex[1].with_traceback(ex[2])
  File "/opt/conda/lib/python3.6/site-packages/pluggy/callers.py", line 187, in _multicall
    res = hook_impl.function(*args)
  File "/opt/conda/lib/python3.6/site-packages/pytest_forked/__init__.py", line 35, in pytest_runtest_protocol
    reports = forked_run_report(item)
  File "/opt/conda/lib/python3.6/site-packages/pytest_forked/__init__.py", line 56, in forked_run_report
    result = ff.waitfinish()
  File "/opt/conda/lib/python3.6/site-packages/py/_process/forkedfunc.py", line 82, in waitfinish
    pid, systemstatus = waiter(self.pid, 0)
  File "/opt/conda/lib/python3.6/site-packages/pytest_timeout.py", line 140, in handler
    timeout_sigalrm(item, params.timeout)
  File "/opt/conda/lib/python3.6/site-packages/pytest_timeout.py", line 313, in timeout_sigalrm
    pytest.fail('Timeout >%ss' % timeout)
  File "/opt/conda/lib/python3.6/site-packages/_pytest/outcomes.py", line 113, in fail
    raise Failed(msg=msg, pytrace=pytrace)
Failed: Timeout >120.0s

It seems that the traceback includes only pytest-forked information.
Do you know if it's at all possible to get some traceback info from the wrapped test?
Thanks.

xdist master can't see workers' workeroutput when using --forked

Hi fellas.

I am building a test profiler for collecting various statistics before/during test case execution.
Since we use pytest-xdist, I decided it's finally time to look into how the new xdist hooks from newhooks.py work and try to get as much value out of them as possible, since doing shenanigans with locks and writing output to tmp files is pretty ugly.

I have finished my profiler MVP, and it was working nicely until I tried to run it in our Jenkins, which uses the --forked option.
setup
python-3.8.10
pytest-7.4.3
pytest-xdist-3.3.1
pytest-forked-1.6.0

ISSUE
It turned out that no stats are displayed when I use this option.
Inside the pytest_testnodedown hook, where the master node collects and aggregates the statistics from all worker nodes, the node.workeroutput attribute is simply empty.

Is there some workaround for how to properly send data from the workers to the master when using --forked? Am I just missing something?

Here are the two most important hooks used for collecting and aggregating all the statistics.

    @pytest.hookimpl(hookwrapper=True, tryfirst=True)
    def pytest_sessionfinish(self):
        worker_id = get_xdist_worker_id(self)
        if worker_id != "master":
            # NOTE: Execnet can't serialize custom classes. Serialization has to be
            # implemented for everything we want to send in-between master and workers
            self.config.workeroutput[
                f"{worker_id}-stats"
            ] = self._profiling_stats.serialize()
        yield

    @pytest.hookimpl(trylast=True)
    def pytest_testnodedown(self, node):
        worker_id = node.workerinput["workerid"]
        node_stats = node.workeroutput[f"{worker_id}-stats"]
        node_stats = ProfilerNamespaceUnit.deserialize(node_stats)
        self._profiling_stats.merge_trees(node_stats)
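
For context, a minimal, self-contained sketch of the worker-to-master handoff described above, with the profiler code stripped out; the conftest.py placement and the "example-stat" key are illustrative assumptions, and pytest-xdist must be installed for the pytest_testnodedown hook to exist:

    # conftest.py - illustrative sketch of the bare xdist workeroutput mechanism
    def pytest_sessionfinish(session):
        # On an xdist worker, config.workeroutput is the dict that is sent back
        # to the master when the worker shuts down; it does not exist on the master.
        workeroutput = getattr(session.config, "workeroutput", None)
        if workeroutput is not None:
            workeroutput["example-stat"] = 42  # values must be serializable by execnet

    def pytest_testnodedown(node, error):
        # On the master, node.workeroutput holds whatever the worker stored above;
        # the report here is that this dict arrives empty once --forked is added.
        print(node.workeroutput.get("example-stat"))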

Backward compatibility with pytest 3.1.3

Hello, I am using Jenkins ver. 2.176.1 and a Python virtualenv. Since yesterday, pytest-forked (1.1.1) has not been compatible with older pytest (3.1.3). The run ends with a pytest internal error.

+ py.test -n2 --junit-xml=build/test-reports/TEST-result.xml -s -v
============================= test session starts ==============================
platform linux -- Python 3.5.0, pytest-3.1.3, py-1.4.33, pluggy-0.4.0 -- /var/lib/jenkins/shiningpanda/jobs/833158f0/virtualenvs/d41d8cd9/bin/python3.5
cachedir: .cache
rootdir: /var/lib/jenkins/jobs/Autotests nfdumptools a exporter/workspace, inifile:
plugins: xdist-1.24.1, forked-1.1.1
gw0 I / gw1 I

[gw0] linux Python 3.5.0 cwd: /var/lib/jenkins/jobs/Autotests nfdumptools a exporter/workspace

[gw1] linux Python 3.5.0 cwd: /var/lib/jenkins/jobs/Autotests nfdumptools a exporter/workspace

[gw0] Python 3.5.0 (default, Mar  1 2017, 13:17:53)  -- [GCC 4.8.5 20150623 (Red Hat 4.8.5-11)]

[gw1] Python 3.5.0 (default, Mar  1 2017, 13:17:53)  -- [GCC 4.8.5 20150623 (Red Hat 4.8.5-11)]
gw0 [141] / gw1 [141]

scheduling tests via LoadScheduling
INTERNALERROR> Traceback (most recent call last):
INTERNALERROR>   File "/var/lib/jenkins/shiningpanda/jobs/833158f0/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/main.py", line 105, in wrap_session
INTERNALERROR>     session.exitstatus = doit(config, session) or 0
INTERNALERROR>   File "/var/lib/jenkins/shiningpanda/jobs/833158f0/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/main.py", line 141, in _main
INTERNALERROR>     config.hook.pytest_runtestloop(session=session)
INTERNALERROR>   File "/var/lib/jenkins/shiningpanda/jobs/833158f0/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 745, in __call__
INTERNALERROR>     return self._hookexec(self, self._nonwrappers + self._wrappers, kwargs)
INTERNALERROR>   File "/var/lib/jenkins/shiningpanda/jobs/833158f0/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 339, in _hookexec
INTERNALERROR>     return self._inner_hookexec(hook, methods, kwargs)
INTERNALERROR>   File "/var/lib/jenkins/shiningpanda/jobs/833158f0/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 334, in <lambda>
INTERNALERROR>     _MultiCall(methods, kwargs, hook.spec_opts).execute()
INTERNALERROR>   File "/var/lib/jenkins/shiningpanda/jobs/833158f0/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 614, in execute
INTERNALERROR>     res = hook_impl.function(*args)
INTERNALERROR>   File "<remote exec>", line 66, in pytest_runtestloop
INTERNALERROR>   File "<remote exec>", line 83, in run_one_test
INTERNALERROR>   File "/var/lib/jenkins/shiningpanda/jobs/833158f0/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 745, in __call__
INTERNALERROR>     return self._hookexec(self, self._nonwrappers + self._wrappers, kwargs)
INTERNALERROR>   File "/var/lib/jenkins/shiningpanda/jobs/833158f0/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 339, in _hookexec
INTERNALERROR>     return self._inner_hookexec(hook, methods, kwargs)
INTERNALERROR>   File "/var/lib/jenkins/shiningpanda/jobs/833158f0/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 334, in <lambda>
INTERNALERROR>     _MultiCall(methods, kwargs, hook.spec_opts).execute()
INTERNALERROR>   File "/var/lib/jenkins/shiningpanda/jobs/833158f0/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 613, in execute
INTERNALERROR>     return _wrapped_call(hook_impl.function(*args), self.execute)
INTERNALERROR>   File "/var/lib/jenkins/shiningpanda/jobs/833158f0/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 254, in _wrapped_call
INTERNALERROR>     return call_outcome.get_result()
INTERNALERROR>   File "/var/lib/jenkins/shiningpanda/jobs/833158f0/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 279, in get_result
INTERNALERROR>     raise ex[1].with_traceback(ex[2])
INTERNALERROR>   File "/var/lib/jenkins/shiningpanda/jobs/833158f0/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 265, in __init__
INTERNALERROR>     self.result = func()
INTERNALERROR>   File "/var/lib/jenkins/shiningpanda/jobs/833158f0/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 613, in execute
INTERNALERROR>     return _wrapped_call(hook_impl.function(*args), self.execute)
INTERNALERROR>   File "/var/lib/jenkins/shiningpanda/jobs/833158f0/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 254, in _wrapped_call
INTERNALERROR>     return call_outcome.get_result()
INTERNALERROR>   File "/var/lib/jenkins/shiningpanda/jobs/833158f0/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 279, in get_result
INTERNALERROR>     raise ex[1].with_traceback(ex[2])
INTERNALERROR>   File "/var/lib/jenkins/shiningpanda/jobs/833158f0/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 265, in __init__
INTERNALERROR>     self.result = func()
INTERNALERROR>   File "/var/lib/jenkins/shiningpanda/jobs/833158f0/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 614, in execute
INTERNALERROR>     res = hook_impl.function(*args)
INTERNALERROR>   File "/var/lib/jenkins/shiningpanda/jobs/833158f0/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pytest_forked/__init__.py", line 41, in pytest_runtest_protocol
INTERNALERROR>     if item.config.getvalue("forked") or item.get_closest_marker("forked"):
INTERNALERROR> AttributeError: 'Function' object has no attribute 'get_closest_marker'
INTERNALERROR> Traceback (most recent call last):
INTERNALERROR>   File "/var/lib/jenkins/shiningpanda/jobs/833158f0/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/main.py", line 105, in wrap_session
INTERNALERROR>     session.exitstatus = doit(config, session) or 0
INTERNALERROR>   File "/var/lib/jenkins/shiningpanda/jobs/833158f0/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/main.py", line 141, in _main
INTERNALERROR>     config.hook.pytest_runtestloop(session=session)
INTERNALERROR>   File "/var/lib/jenkins/shiningpanda/jobs/833158f0/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 745, in __call__
INTERNALERROR>     return self._hookexec(self, self._nonwrappers + self._wrappers, kwargs)
INTERNALERROR>   File "/var/lib/jenkins/shiningpanda/jobs/833158f0/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 339, in _hookexec
INTERNALERROR>     return self._inner_hookexec(hook, methods, kwargs)
INTERNALERROR>   File "/var/lib/jenkins/shiningpanda/jobs/833158f0/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 334, in <lambda>
INTERNALERROR>     _MultiCall(methods, kwargs, hook.spec_opts).execute()
INTERNALERROR>   File "/var/lib/jenkins/shiningpanda/jobs/833158f0/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 614, in execute
INTERNALERROR>     res = hook_impl.function(*args)
INTERNALERROR>   File "<remote exec>", line 66, in pytest_runtestloop
INTERNALERROR>   File "<remote exec>", line 83, in run_one_test
INTERNALERROR>   File "/var/lib/jenkins/shiningpanda/jobs/833158f0/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 745, in __call__
INTERNALERROR>     return self._hookexec(self, self._nonwrappers + self._wrappers, kwargs)
INTERNALERROR>   File "/var/lib/jenkins/shiningpanda/jobs/833158f0/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 339, in _hookexec
INTERNALERROR>     return self._inner_hookexec(hook, methods, kwargs)
INTERNALERROR>   File "/var/lib/jenkins/shiningpanda/jobs/833158f0/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 334, in <lambda>
INTERNALERROR>     _MultiCall(methods, kwargs, hook.spec_opts).execute()
INTERNALERROR>   File "/var/lib/jenkins/shiningpanda/jobs/833158f0/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 613, in execute
INTERNALERROR>     return _wrapped_call(hook_impl.function(*args), self.execute)
INTERNALERROR>   File "/var/lib/jenkins/shiningpanda/jobs/833158f0/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 254, in _wrapped_call
INTERNALERROR>     return call_outcome.get_result()
INTERNALERROR>   File "/var/lib/jenkins/shiningpanda/jobs/833158f0/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 279, in get_result
INTERNALERROR>     raise ex[1].with_traceback(ex[2])
INTERNALERROR>   File "/var/lib/jenkins/shiningpanda/jobs/833158f0/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 265, in __init__
INTERNALERROR>     self.result = func()
INTERNALERROR>   File "/var/lib/jenkins/shiningpanda/jobs/833158f0/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 613, in execute
INTERNALERROR>     return _wrapped_call(hook_impl.function(*args), self.execute)
INTERNALERROR>   File "/var/lib/jenkins/shiningpanda/jobs/833158f0/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 254, in _wrapped_call
INTERNALERROR>     return call_outcome.get_result()
INTERNALERROR>   File "/var/lib/jenkins/shiningpanda/jobs/833158f0/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 279, in get_result
INTERNALERROR>     raise ex[1].with_traceback(ex[2])
INTERNALERROR>   File "/var/lib/jenkins/shiningpanda/jobs/833158f0/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 265, in __init__
INTERNALERROR>     self.result = func()
INTERNALERROR>   File "/var/lib/jenkins/shiningpanda/jobs/833158f0/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 614, in execute
INTERNALERROR>     res = hook_impl.function(*args)
INTERNALERROR>   File "/var/lib/jenkins/shiningpanda/jobs/833158f0/virtualenvs/d41d8cd9/lib/python3.5/site-packages/pytest_forked/__init__.py", line 41, in pytest_runtest_protocol
INTERNALERROR>     if item.config.getvalue("forked") or item.get_closest_marker("forked"):
INTERNALERROR> AttributeError: 'Function' object has no attribute 'get_closest_marker'
INTERNALERROR> Traceback (most recent call last):
INTERNALERROR>   File "/var/lib/jenkins/shiningpanda/jobs/833158f0/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/main.py", line 105, in wrap_session
INTERNALERROR>     session.exitstatus = doit(config, session) or 0
INTERNALERROR>   File "/var/lib/jenkins/shiningpanda/jobs/833158f0/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/main.py", line 141, in _main
INTERNALERROR>     config.hook.pytest_runtestloop(session=session)
INTERNALERROR>   File "/var/lib/jenkins/shiningpanda/jobs/833158f0/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 745, in __call__
INTERNALERROR>     return self._hookexec(self, self._nonwrappers + self._wrappers, kwargs)
INTERNALERROR>   File "/var/lib/jenkins/shiningpanda/jobs/833158f0/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 339, in _hookexec
INTERNALERROR>     return self._inner_hookexec(hook, methods, kwargs)
INTERNALERROR>   File "/var/lib/jenkins/shiningpanda/jobs/833158f0/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 334, in <lambda>
INTERNALERROR>     _MultiCall(methods, kwargs, hook.spec_opts).execute()
INTERNALERROR>   File "/var/lib/jenkins/shiningpanda/jobs/833158f0/virtualenvs/d41d8cd9/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 614, in execute
INTERNALERROR>     res = hook_impl.function(*args)
INTERNALERROR>   File "/var/lib/jenkins/shiningpanda/jobs/833158f0/virtualenvs/d41d8cd9/lib/python3.5/site-packages/xdist/dsession.py", line 115, in pytest_runtestloop
INTERNALERROR>     self.loop_once()
INTERNALERROR>   File "/var/lib/jenkins/shiningpanda/jobs/833158f0/virtualenvs/d41d8cd9/lib/python3.5/site-packages/xdist/dsession.py", line 138, in loop_once
INTERNALERROR>     call(**kwargs)
INTERNALERROR>   File "/var/lib/jenkins/shiningpanda/jobs/833158f0/virtualenvs/d41d8cd9/lib/python3.5/site-packages/xdist/dsession.py", line 180, in worker_workerfinished
INTERNALERROR>     assert not crashitem, (crashitem, node)
INTERNALERROR> AssertionError: ('test_case.py::TestCase::()::test_reboot', <WorkerController gw0>)
INTERNALERROR> assert not 'test_case.py::TestCase::()::test_reboot'

I guess it will be refused since pytest 3.1.3 is EOL, but it took me some time to find out what the problem was. So maybe a backward compatibility check and an informative message would be nice? Or maybe at least this post will help someone.
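
For what it's worth, a hypothetical sketch of the kind of guard being suggested here, assuming the goal is simply to fail fast with a readable message on pytest versions that predate Node.get_closest_marker() (added in pytest 3.6); this is not the plugin's actual code:

    import pytest

    def pytest_configure(config):
        # Abort with a clear message instead of crashing later with an INTERNALERROR
        # when the installed pytest is too old for this plugin.
        version = tuple(int(part) for part in pytest.__version__.split(".")[:2])
        if version < (3, 6):
            raise pytest.UsageError("pytest-forked requires pytest >= 3.6")

In practice the same effect is usually achieved by declaring a minimum pytest version in the package metadata, so that pip refuses the combination at install time.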

pytest 6.0.0rc1: TypeError: append() takes exactly one argument (5 given)

Hello, this is pytest-forked @ b29c386:

$ tox -e py38-pytestlatest --pre
py38-pytestlatest inst-nodeps: .../pytest-forked/.tox/.tmp/package/1/pytest-forked-1.2.1.dev4+gb29c386.tar.gz
py38-pytestlatest installed: attrs==19.3.0,iniconfig==1.0.0,more-itertools==8.4.0,packaging==20.4,pluggy==0.13.1,py==1.9.0,pycmd==1.2,pyparsing==3.0.0a2,pytest==6.0.0rc1,pytest-forked==1.2.1.dev4+gb29c386,setuptools-scm==4.1.2,six==1.15.0,toml==0.10.1
py38-pytestlatest run-test-pre: PYTHONHASHSEED='3934015748'
py38-pytestlatest run-test: commands[0] | pytest
============================= test session starts ==============================
platform linux -- Python 3.8.5, pytest-6.0.0rc1, py-1.9.0, pluggy-0.13.1
cachedir: .tox/py38-pytestlatest/.pytest_cache
rootdir: .../pytest-forked, configfile: tox.ini
plugins: forked-1.2.1.dev4+gb29c386
collected 10 items

testing/test_boxed.py EEExxE                                             [ 60%]
testing/test_xfail_behavior.py EEEE                                      [100%]

==================================== ERRORS ====================================
___________________ ERROR at setup of test_functional_boxed ____________________

cls = <class '_pytest.runner.CallInfo'>
func = <function call_runtest_hook.<locals>.<lambda> at 0x7eff61f7eca0>
when = 'setup'
reraise = (<class '_pytest.outcomes.Exit'>, <class 'KeyboardInterrupt'>)

    @classmethod
    def from_call(
        cls,
        func: "Callable[[], _T]",
        when: "Literal['collect', 'setup', 'call', 'teardown']",
        reraise: "Optional[Union[Type[BaseException], Tuple[Type[BaseException], ...]]]" = None,
    ) -> "CallInfo[_T]":
        excinfo = None
        start = timing.time()
        precise_start = timing.perf_counter()
        try:
>           result = func()  # type: Optional[_T]

.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/runner.py:287: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/runner.py:240: in <lambda>
    lambda: ihook(item=item, **kwds), when=when, reraise=reraise
.tox/py38-pytestlatest/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
.tox/py38-pytestlatest/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
.tox/py38-pytestlatest/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/runner.py:141: in pytest_runtest_setup
    item.session._setupstate.prepare(item)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/runner.py:428: in prepare
    raise e
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/runner.py:425: in prepare
    col.setup()
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/python.py:1568: in setup
    self._request._fillfixtures()
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:568: in _fillfixtures
    item.funcargs[argname] = self.getfixturevalue(argname)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:581: in getfixturevalue
    fixturedef = self._get_active_fixturedef(argname)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:601: in _get_active_fixturedef
    self._compute_fixture_value(fixturedef)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:683: in _compute_fixture_value
    fixturedef.execute(request=subrequest)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:1053: in execute
    result = hook.pytest_fixture_setup(fixturedef=self, request=request)
.tox/py38-pytestlatest/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
.tox/py38-pytestlatest/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
.tox/py38-pytestlatest/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:1108: in pytest_fixture_setup
    result = call_fixture_func(fixturefunc, request, kwargs)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:915: in call_fixture_func
    fixture_result = fixturefunc(**kwargs)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/pytester.py:387: in testdir
    return Testdir(request, tmpdir_factory)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/pytester.py:585: in __init__
    self.tmpdir = tmpdir_factory.mktemp(name, numbered=True)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/tmpdir.py:120: in mktemp
    return py.path.local(self._tmppath_factory.mktemp(basename, numbered).resolve())
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/tmpdir.py:72: in mktemp
    basename = self._ensure_relative_to_basetemp(basename)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/tmpdir.py:51: in _ensure_relative_to_basetemp
    if (self.getbasetemp() / basename).resolve().parent != self.getbasetemp():
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/tmpdir.py:98: in getbasetemp
    basetemp = make_numbered_dir_with_cleanup(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

root = PosixPath('/tmp/pytest-of-churchyard'), prefix = 'pytest-', keep = 3
lock_timeout = 10800

    def make_numbered_dir_with_cleanup(
        root: Path, prefix: str, keep: int, lock_timeout: float
    ) -> Path:
        """creates a numbered dir with a cleanup lock and removes old ones"""
        e = None
        for i in range(10):
            try:
                p = make_numbered_dir(root, prefix)
                lock_path = create_cleanup_lock(p)
                register_cleanup_lock_removal(lock_path)
            except Exception as exc:
                e = exc
            else:
                consider_lock_dead_if_created_before = p.stat().st_mtime - lock_timeout
                # Register a cleanup for program exit
>               atexit.register(
                    cleanup_numbered_dir,
                    root,
                    prefix,
                    keep,
                    consider_lock_dead_if_created_before,
                )
E               TypeError: append() takes exactly one argument (5 given)

.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/pathlib.py:354: TypeError
_______________ ERROR at setup of test_functional_boxed_per_test _______________

cls = <class '_pytest.runner.CallInfo'>
func = <function call_runtest_hook.<locals>.<lambda> at 0x7eff61f7eaf0>
when = 'setup'
reraise = (<class '_pytest.outcomes.Exit'>, <class 'KeyboardInterrupt'>)

    @classmethod
    def from_call(
        cls,
        func: "Callable[[], _T]",
        when: "Literal['collect', 'setup', 'call', 'teardown']",
        reraise: "Optional[Union[Type[BaseException], Tuple[Type[BaseException], ...]]]" = None,
    ) -> "CallInfo[_T]":
        excinfo = None
        start = timing.time()
        precise_start = timing.perf_counter()
        try:
>           result = func()  # type: Optional[_T]

.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/runner.py:287: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/runner.py:240: in <lambda>
    lambda: ihook(item=item, **kwds), when=when, reraise=reraise
.tox/py38-pytestlatest/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
.tox/py38-pytestlatest/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
.tox/py38-pytestlatest/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/runner.py:141: in pytest_runtest_setup
    item.session._setupstate.prepare(item)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/runner.py:428: in prepare
    raise e
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/runner.py:425: in prepare
    col.setup()
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/python.py:1568: in setup
    self._request._fillfixtures()
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:568: in _fillfixtures
    item.funcargs[argname] = self.getfixturevalue(argname)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:581: in getfixturevalue
    fixturedef = self._get_active_fixturedef(argname)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:601: in _get_active_fixturedef
    self._compute_fixture_value(fixturedef)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:683: in _compute_fixture_value
    fixturedef.execute(request=subrequest)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:1053: in execute
    result = hook.pytest_fixture_setup(fixturedef=self, request=request)
.tox/py38-pytestlatest/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
.tox/py38-pytestlatest/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
.tox/py38-pytestlatest/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:1108: in pytest_fixture_setup
    result = call_fixture_func(fixturefunc, request, kwargs)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:915: in call_fixture_func
    fixture_result = fixturefunc(**kwargs)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/pytester.py:387: in testdir
    return Testdir(request, tmpdir_factory)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/pytester.py:585: in __init__
    self.tmpdir = tmpdir_factory.mktemp(name, numbered=True)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/tmpdir.py:120: in mktemp
    return py.path.local(self._tmppath_factory.mktemp(basename, numbered).resolve())
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/tmpdir.py:72: in mktemp
    basename = self._ensure_relative_to_basetemp(basename)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/tmpdir.py:51: in _ensure_relative_to_basetemp
    if (self.getbasetemp() / basename).resolve().parent != self.getbasetemp():
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/tmpdir.py:98: in getbasetemp
    basetemp = make_numbered_dir_with_cleanup(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

root = PosixPath('/tmp/pytest-of-churchyard'), prefix = 'pytest-', keep = 3
lock_timeout = 10800

    def make_numbered_dir_with_cleanup(
        root: Path, prefix: str, keep: int, lock_timeout: float
    ) -> Path:
        """creates a numbered dir with a cleanup lock and removes old ones"""
        e = None
        for i in range(10):
            try:
                p = make_numbered_dir(root, prefix)
                lock_path = create_cleanup_lock(p)
                register_cleanup_lock_removal(lock_path)
            except Exception as exc:
                e = exc
            else:
                consider_lock_dead_if_created_before = p.stat().st_mtime - lock_timeout
                # Register a cleanup for program exit
>               atexit.register(
                    cleanup_numbered_dir,
                    root,
                    prefix,
                    keep,
                    consider_lock_dead_if_created_before,
                )
E               TypeError: append() takes exactly one argument (5 given)

.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/pathlib.py:354: TypeError
____________ ERROR at setup of test_functional_boxed_capturing[no] _____________

cls = <class '_pytest.runner.CallInfo'>
func = <function call_runtest_hook.<locals>.<lambda> at 0x7eff61ea9e50>
when = 'setup'
reraise = (<class '_pytest.outcomes.Exit'>, <class 'KeyboardInterrupt'>)

    @classmethod
    def from_call(
        cls,
        func: "Callable[[], _T]",
        when: "Literal['collect', 'setup', 'call', 'teardown']",
        reraise: "Optional[Union[Type[BaseException], Tuple[Type[BaseException], ...]]]" = None,
    ) -> "CallInfo[_T]":
        excinfo = None
        start = timing.time()
        precise_start = timing.perf_counter()
        try:
>           result = func()  # type: Optional[_T]

.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/runner.py:287: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/runner.py:240: in <lambda>
    lambda: ihook(item=item, **kwds), when=when, reraise=reraise
.tox/py38-pytestlatest/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
.tox/py38-pytestlatest/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
.tox/py38-pytestlatest/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/runner.py:141: in pytest_runtest_setup
    item.session._setupstate.prepare(item)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/runner.py:428: in prepare
    raise e
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/runner.py:425: in prepare
    col.setup()
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/python.py:1568: in setup
    self._request._fillfixtures()
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:568: in _fillfixtures
    item.funcargs[argname] = self.getfixturevalue(argname)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:581: in getfixturevalue
    fixturedef = self._get_active_fixturedef(argname)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:601: in _get_active_fixturedef
    self._compute_fixture_value(fixturedef)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:683: in _compute_fixture_value
    fixturedef.execute(request=subrequest)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:1053: in execute
    result = hook.pytest_fixture_setup(fixturedef=self, request=request)
.tox/py38-pytestlatest/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
.tox/py38-pytestlatest/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
.tox/py38-pytestlatest/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:1108: in pytest_fixture_setup
    result = call_fixture_func(fixturefunc, request, kwargs)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:915: in call_fixture_func
    fixture_result = fixturefunc(**kwargs)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/pytester.py:387: in testdir
    return Testdir(request, tmpdir_factory)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/pytester.py:585: in __init__
    self.tmpdir = tmpdir_factory.mktemp(name, numbered=True)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/tmpdir.py:120: in mktemp
    return py.path.local(self._tmppath_factory.mktemp(basename, numbered).resolve())
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/tmpdir.py:72: in mktemp
    basename = self._ensure_relative_to_basetemp(basename)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/tmpdir.py:51: in _ensure_relative_to_basetemp
    if (self.getbasetemp() / basename).resolve().parent != self.getbasetemp():
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/tmpdir.py:98: in getbasetemp
    basetemp = make_numbered_dir_with_cleanup(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

root = PosixPath('/tmp/pytest-of-churchyard'), prefix = 'pytest-', keep = 3
lock_timeout = 10800

    def make_numbered_dir_with_cleanup(
        root: Path, prefix: str, keep: int, lock_timeout: float
    ) -> Path:
        """creates a numbered dir with a cleanup lock and removes old ones"""
        e = None
        for i in range(10):
            try:
                p = make_numbered_dir(root, prefix)
                lock_path = create_cleanup_lock(p)
                register_cleanup_lock_removal(lock_path)
            except Exception as exc:
                e = exc
            else:
                consider_lock_dead_if_created_before = p.stat().st_mtime - lock_timeout
                # Register a cleanup for program exit
>               atexit.register(
                    cleanup_numbered_dir,
                    root,
                    prefix,
                    keep,
                    consider_lock_dead_if_created_before,
                )
E               TypeError: append() takes exactly one argument (5 given)

.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/pathlib.py:354: TypeError
________________ ERROR at setup of test_is_not_boxed_by_default ________________

cls = <class '_pytest.runner.CallInfo'>
func = <function call_runtest_hook.<locals>.<lambda> at 0x7eff61dded30>
when = 'setup'
reraise = (<class '_pytest.outcomes.Exit'>, <class 'KeyboardInterrupt'>)

    @classmethod
    def from_call(
        cls,
        func: "Callable[[], _T]",
        when: "Literal['collect', 'setup', 'call', 'teardown']",
        reraise: "Optional[Union[Type[BaseException], Tuple[Type[BaseException], ...]]]" = None,
    ) -> "CallInfo[_T]":
        excinfo = None
        start = timing.time()
        precise_start = timing.perf_counter()
        try:
>           result = func()  # type: Optional[_T]

.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/runner.py:287: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/runner.py:240: in <lambda>
    lambda: ihook(item=item, **kwds), when=when, reraise=reraise
.tox/py38-pytestlatest/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
.tox/py38-pytestlatest/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
.tox/py38-pytestlatest/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/runner.py:141: in pytest_runtest_setup
    item.session._setupstate.prepare(item)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/runner.py:428: in prepare
    raise e
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/runner.py:425: in prepare
    col.setup()
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/python.py:1568: in setup
    self._request._fillfixtures()
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:568: in _fillfixtures
    item.funcargs[argname] = self.getfixturevalue(argname)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:581: in getfixturevalue
    fixturedef = self._get_active_fixturedef(argname)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:601: in _get_active_fixturedef
    self._compute_fixture_value(fixturedef)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:683: in _compute_fixture_value
    fixturedef.execute(request=subrequest)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:1053: in execute
    result = hook.pytest_fixture_setup(fixturedef=self, request=request)
.tox/py38-pytestlatest/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
.tox/py38-pytestlatest/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
.tox/py38-pytestlatest/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:1108: in pytest_fixture_setup
    result = call_fixture_func(fixturefunc, request, kwargs)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:915: in call_fixture_func
    fixture_result = fixturefunc(**kwargs)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/pytester.py:387: in testdir
    return Testdir(request, tmpdir_factory)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/pytester.py:585: in __init__
    self.tmpdir = tmpdir_factory.mktemp(name, numbered=True)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/tmpdir.py:120: in mktemp
    return py.path.local(self._tmppath_factory.mktemp(basename, numbered).resolve())
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/tmpdir.py:72: in mktemp
    basename = self._ensure_relative_to_basetemp(basename)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/tmpdir.py:51: in _ensure_relative_to_basetemp
    if (self.getbasetemp() / basename).resolve().parent != self.getbasetemp():
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/tmpdir.py:98: in getbasetemp
    basetemp = make_numbered_dir_with_cleanup(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

root = PosixPath('/tmp/pytest-of-churchyard'), prefix = 'pytest-', keep = 3
lock_timeout = 10800

    def make_numbered_dir_with_cleanup(
        root: Path, prefix: str, keep: int, lock_timeout: float
    ) -> Path:
        """creates a numbered dir with a cleanup lock and removes old ones"""
        e = None
        for i in range(10):
            try:
                p = make_numbered_dir(root, prefix)
                lock_path = create_cleanup_lock(p)
                register_cleanup_lock_removal(lock_path)
            except Exception as exc:
                e = exc
            else:
                consider_lock_dead_if_created_before = p.stat().st_mtime - lock_timeout
                # Register a cleanup for program exit
>               atexit.register(
                    cleanup_numbered_dir,
                    root,
                    prefix,
                    keep,
                    consider_lock_dead_if_created_before,
                )
E               TypeError: append() takes exactly one argument (5 given)

.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/pathlib.py:354: TypeError
__________________ ERROR at setup of test_xfail[strict xfail] __________________

cls = <class '_pytest.runner.CallInfo'>
func = <function call_runtest_hook.<locals>.<lambda> at 0x7eff61893040>
when = 'setup'
reraise = (<class '_pytest.outcomes.Exit'>, <class 'KeyboardInterrupt'>)

    @classmethod
    def from_call(
        cls,
        func: "Callable[[], _T]",
        when: "Literal['collect', 'setup', 'call', 'teardown']",
        reraise: "Optional[Union[Type[BaseException], Tuple[Type[BaseException], ...]]]" = None,
    ) -> "CallInfo[_T]":
        excinfo = None
        start = timing.time()
        precise_start = timing.perf_counter()
        try:
>           result = func()  # type: Optional[_T]

.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/runner.py:287: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/runner.py:240: in <lambda>
    lambda: ihook(item=item, **kwds), when=when, reraise=reraise
.tox/py38-pytestlatest/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
.tox/py38-pytestlatest/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
.tox/py38-pytestlatest/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/runner.py:141: in pytest_runtest_setup
    item.session._setupstate.prepare(item)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/runner.py:428: in prepare
    raise e
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/runner.py:425: in prepare
    col.setup()
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/python.py:1568: in setup
    self._request._fillfixtures()
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:568: in _fillfixtures
    item.funcargs[argname] = self.getfixturevalue(argname)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:581: in getfixturevalue
    fixturedef = self._get_active_fixturedef(argname)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:601: in _get_active_fixturedef
    self._compute_fixture_value(fixturedef)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:683: in _compute_fixture_value
    fixturedef.execute(request=subrequest)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:1053: in execute
    result = hook.pytest_fixture_setup(fixturedef=self, request=request)
.tox/py38-pytestlatest/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
.tox/py38-pytestlatest/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
.tox/py38-pytestlatest/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:1108: in pytest_fixture_setup
    result = call_fixture_func(fixturefunc, request, kwargs)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:915: in call_fixture_func
    fixture_result = fixturefunc(**kwargs)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/pytester.py:387: in testdir
    return Testdir(request, tmpdir_factory)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/pytester.py:585: in __init__
    self.tmpdir = tmpdir_factory.mktemp(name, numbered=True)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/tmpdir.py:120: in mktemp
    return py.path.local(self._tmppath_factory.mktemp(basename, numbered).resolve())
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/tmpdir.py:72: in mktemp
    basename = self._ensure_relative_to_basetemp(basename)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/tmpdir.py:51: in _ensure_relative_to_basetemp
    if (self.getbasetemp() / basename).resolve().parent != self.getbasetemp():
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/tmpdir.py:98: in getbasetemp
    basetemp = make_numbered_dir_with_cleanup(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

root = PosixPath('/tmp/pytest-of-churchyard'), prefix = 'pytest-', keep = 3
lock_timeout = 10800

    def make_numbered_dir_with_cleanup(
        root: Path, prefix: str, keep: int, lock_timeout: float
    ) -> Path:
        """creates a numbered dir with a cleanup lock and removes old ones"""
        e = None
        for i in range(10):
            try:
                p = make_numbered_dir(root, prefix)
                lock_path = create_cleanup_lock(p)
                register_cleanup_lock_removal(lock_path)
            except Exception as exc:
                e = exc
            else:
                consider_lock_dead_if_created_before = p.stat().st_mtime - lock_timeout
                # Register a cleanup for program exit
>               atexit.register(
                    cleanup_numbered_dir,
                    root,
                    prefix,
                    keep,
                    consider_lock_dead_if_created_before,
                )
E               TypeError: append() takes exactly one argument (5 given)

.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/pathlib.py:354: TypeError
__________________ ERROR at setup of test_xfail[strict xpass] __________________

cls = <class '_pytest.runner.CallInfo'>
func = <function call_runtest_hook.<locals>.<lambda> at 0x7eff61f7ea60>
when = 'setup'
reraise = (<class '_pytest.outcomes.Exit'>, <class 'KeyboardInterrupt'>)

    @classmethod
    def from_call(
        cls,
        func: "Callable[[], _T]",
        when: "Literal['collect', 'setup', 'call', 'teardown']",
        reraise: "Optional[Union[Type[BaseException], Tuple[Type[BaseException], ...]]]" = None,
    ) -> "CallInfo[_T]":
        excinfo = None
        start = timing.time()
        precise_start = timing.perf_counter()
        try:
>           result = func()  # type: Optional[_T]

.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/runner.py:287: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/runner.py:240: in <lambda>
    lambda: ihook(item=item, **kwds), when=when, reraise=reraise
.tox/py38-pytestlatest/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
.tox/py38-pytestlatest/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
.tox/py38-pytestlatest/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/runner.py:141: in pytest_runtest_setup
    item.session._setupstate.prepare(item)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/runner.py:428: in prepare
    raise e
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/runner.py:425: in prepare
    col.setup()
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/python.py:1568: in setup
    self._request._fillfixtures()
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:568: in _fillfixtures
    item.funcargs[argname] = self.getfixturevalue(argname)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:581: in getfixturevalue
    fixturedef = self._get_active_fixturedef(argname)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:601: in _get_active_fixturedef
    self._compute_fixture_value(fixturedef)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:683: in _compute_fixture_value
    fixturedef.execute(request=subrequest)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:1053: in execute
    result = hook.pytest_fixture_setup(fixturedef=self, request=request)
.tox/py38-pytestlatest/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
.tox/py38-pytestlatest/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
.tox/py38-pytestlatest/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:1108: in pytest_fixture_setup
    result = call_fixture_func(fixturefunc, request, kwargs)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:915: in call_fixture_func
    fixture_result = fixturefunc(**kwargs)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/pytester.py:387: in testdir
    return Testdir(request, tmpdir_factory)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/pytester.py:585: in __init__
    self.tmpdir = tmpdir_factory.mktemp(name, numbered=True)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/tmpdir.py:120: in mktemp
    return py.path.local(self._tmppath_factory.mktemp(basename, numbered).resolve())
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/tmpdir.py:72: in mktemp
    basename = self._ensure_relative_to_basetemp(basename)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/tmpdir.py:51: in _ensure_relative_to_basetemp
    if (self.getbasetemp() / basename).resolve().parent != self.getbasetemp():
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/tmpdir.py:98: in getbasetemp
    basetemp = make_numbered_dir_with_cleanup(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

root = PosixPath('/tmp/pytest-of-churchyard'), prefix = 'pytest-', keep = 3
lock_timeout = 10800

    def make_numbered_dir_with_cleanup(
        root: Path, prefix: str, keep: int, lock_timeout: float
    ) -> Path:
        """creates a numbered dir with a cleanup lock and removes old ones"""
        e = None
        for i in range(10):
            try:
                p = make_numbered_dir(root, prefix)
                lock_path = create_cleanup_lock(p)
                register_cleanup_lock_removal(lock_path)
            except Exception as exc:
                e = exc
            else:
                consider_lock_dead_if_created_before = p.stat().st_mtime - lock_timeout
                # Register a cleanup for program exit
>               atexit.register(
                    cleanup_numbered_dir,
                    root,
                    prefix,
                    keep,
                    consider_lock_dead_if_created_before,
                )
E               TypeError: append() takes exactly one argument (5 given)

.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/pathlib.py:354: TypeError
________________ ERROR at setup of test_xfail[non-strict xfail] ________________

cls = <class '_pytest.runner.CallInfo'>
func = <function call_runtest_hook.<locals>.<lambda> at 0x7eff61f0ca60>
when = 'setup'
reraise = (<class '_pytest.outcomes.Exit'>, <class 'KeyboardInterrupt'>)

    @classmethod
    def from_call(
        cls,
        func: "Callable[[], _T]",
        when: "Literal['collect', 'setup', 'call', 'teardown']",
        reraise: "Optional[Union[Type[BaseException], Tuple[Type[BaseException], ...]]]" = None,
    ) -> "CallInfo[_T]":
        excinfo = None
        start = timing.time()
        precise_start = timing.perf_counter()
        try:
>           result = func()  # type: Optional[_T]

.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/runner.py:287: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/runner.py:240: in <lambda>
    lambda: ihook(item=item, **kwds), when=when, reraise=reraise
.tox/py38-pytestlatest/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
.tox/py38-pytestlatest/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
.tox/py38-pytestlatest/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/runner.py:141: in pytest_runtest_setup
    item.session._setupstate.prepare(item)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/runner.py:428: in prepare
    raise e
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/runner.py:425: in prepare
    col.setup()
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/python.py:1568: in setup
    self._request._fillfixtures()
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:568: in _fillfixtures
    item.funcargs[argname] = self.getfixturevalue(argname)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:581: in getfixturevalue
    fixturedef = self._get_active_fixturedef(argname)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:601: in _get_active_fixturedef
    self._compute_fixture_value(fixturedef)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:683: in _compute_fixture_value
    fixturedef.execute(request=subrequest)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:1053: in execute
    result = hook.pytest_fixture_setup(fixturedef=self, request=request)
.tox/py38-pytestlatest/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
.tox/py38-pytestlatest/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
.tox/py38-pytestlatest/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:1108: in pytest_fixture_setup
    result = call_fixture_func(fixturefunc, request, kwargs)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:915: in call_fixture_func
    fixture_result = fixturefunc(**kwargs)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/pytester.py:387: in testdir
    return Testdir(request, tmpdir_factory)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/pytester.py:585: in __init__
    self.tmpdir = tmpdir_factory.mktemp(name, numbered=True)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/tmpdir.py:120: in mktemp
    return py.path.local(self._tmppath_factory.mktemp(basename, numbered).resolve())
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/tmpdir.py:72: in mktemp
    basename = self._ensure_relative_to_basetemp(basename)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/tmpdir.py:51: in _ensure_relative_to_basetemp
    if (self.getbasetemp() / basename).resolve().parent != self.getbasetemp():
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/tmpdir.py:98: in getbasetemp
    basetemp = make_numbered_dir_with_cleanup(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

root = PosixPath('/tmp/pytest-of-churchyard'), prefix = 'pytest-', keep = 3
lock_timeout = 10800

    def make_numbered_dir_with_cleanup(
        root: Path, prefix: str, keep: int, lock_timeout: float
    ) -> Path:
        """creates a numbered dir with a cleanup lock and removes old ones"""
        e = None
        for i in range(10):
            try:
                p = make_numbered_dir(root, prefix)
                lock_path = create_cleanup_lock(p)
                register_cleanup_lock_removal(lock_path)
            except Exception as exc:
                e = exc
            else:
                consider_lock_dead_if_created_before = p.stat().st_mtime - lock_timeout
                # Register a cleanup for program exit
>               atexit.register(
                    cleanup_numbered_dir,
                    root,
                    prefix,
                    keep,
                    consider_lock_dead_if_created_before,
                )
E               TypeError: append() takes exactly one argument (5 given)

.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/pathlib.py:354: TypeError
________________ ERROR at setup of test_xfail[non-strict xpass] ________________

cls = <class '_pytest.runner.CallInfo'>
func = <function call_runtest_hook.<locals>.<lambda> at 0x7eff61e45670>
when = 'setup'
reraise = (<class '_pytest.outcomes.Exit'>, <class 'KeyboardInterrupt'>)

    @classmethod
    def from_call(
        cls,
        func: "Callable[[], _T]",
        when: "Literal['collect', 'setup', 'call', 'teardown']",
        reraise: "Optional[Union[Type[BaseException], Tuple[Type[BaseException], ...]]]" = None,
    ) -> "CallInfo[_T]":
        excinfo = None
        start = timing.time()
        precise_start = timing.perf_counter()
        try:
>           result = func()  # type: Optional[_T]

.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/runner.py:287: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/runner.py:240: in <lambda>
    lambda: ihook(item=item, **kwds), when=when, reraise=reraise
.tox/py38-pytestlatest/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
.tox/py38-pytestlatest/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
.tox/py38-pytestlatest/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/runner.py:141: in pytest_runtest_setup
    item.session._setupstate.prepare(item)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/runner.py:428: in prepare
    raise e
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/runner.py:425: in prepare
    col.setup()
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/python.py:1568: in setup
    self._request._fillfixtures()
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:568: in _fillfixtures
    item.funcargs[argname] = self.getfixturevalue(argname)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:581: in getfixturevalue
    fixturedef = self._get_active_fixturedef(argname)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:601: in _get_active_fixturedef
    self._compute_fixture_value(fixturedef)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:683: in _compute_fixture_value
    fixturedef.execute(request=subrequest)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:1053: in execute
    result = hook.pytest_fixture_setup(fixturedef=self, request=request)
.tox/py38-pytestlatest/lib/python3.8/site-packages/pluggy/hooks.py:286: in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
.tox/py38-pytestlatest/lib/python3.8/site-packages/pluggy/manager.py:93: in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
.tox/py38-pytestlatest/lib/python3.8/site-packages/pluggy/manager.py:84: in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:1108: in pytest_fixture_setup
    result = call_fixture_func(fixturefunc, request, kwargs)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/fixtures.py:915: in call_fixture_func
    fixture_result = fixturefunc(**kwargs)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/pytester.py:387: in testdir
    return Testdir(request, tmpdir_factory)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/pytester.py:585: in __init__
    self.tmpdir = tmpdir_factory.mktemp(name, numbered=True)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/tmpdir.py:120: in mktemp
    return py.path.local(self._tmppath_factory.mktemp(basename, numbered).resolve())
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/tmpdir.py:72: in mktemp
    basename = self._ensure_relative_to_basetemp(basename)
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/tmpdir.py:51: in _ensure_relative_to_basetemp
    if (self.getbasetemp() / basename).resolve().parent != self.getbasetemp():
.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/tmpdir.py:98: in getbasetemp
    basetemp = make_numbered_dir_with_cleanup(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

root = PosixPath('/tmp/pytest-of-churchyard'), prefix = 'pytest-', keep = 3
lock_timeout = 10800

    def make_numbered_dir_with_cleanup(
        root: Path, prefix: str, keep: int, lock_timeout: float
    ) -> Path:
        """creates a numbered dir with a cleanup lock and removes old ones"""
        e = None
        for i in range(10):
            try:
                p = make_numbered_dir(root, prefix)
                lock_path = create_cleanup_lock(p)
                register_cleanup_lock_removal(lock_path)
            except Exception as exc:
                e = exc
            else:
                consider_lock_dead_if_created_before = p.stat().st_mtime - lock_timeout
                # Register a cleanup for program exit
>               atexit.register(
                    cleanup_numbered_dir,
                    root,
                    prefix,
                    keep,
                    consider_lock_dead_if_created_before,
                )
E               TypeError: append() takes exactly one argument (5 given)

.tox/py38-pytestlatest/lib/python3.8/site-packages/_pytest/pathlib.py:354: TypeError
=========================== short test summary info ============================
XFAIL testing/test_boxed.py::test_functional_boxed_capturing[sys]
  capture cleanup needed
XFAIL testing/test_boxed.py::test_functional_boxed_capturing[fd]
  capture cleanup needed
========================= 2 xfailed, 8 errors in 1.91s =========================
ERROR: InvocationError for command .../pytest-forked/.tox/py38-pytestlatest/bin/pytest (exited with code 1)
___________________________________ summary ____________________________________
ERROR:   py38-pytestlatest: commands failed

Not sure whether this is a pytest bug or whether pytest-forked needs to change something; the traceback mostly shows pytest code. :/

Display test name before running it

Hi, I sometimes have a hard time troubleshooting tests that hang in an infinite loop.
The reason is that output is printed only after a test finishes (pass or fail), so when a test hangs it is very hard to identify which one is stuck.
Could you please suggest a pattern for printing the test name to STDOUT/STDERR (on a CI server, for example) before each test runs?
Thanks much.
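
One possible pattern (a minimal sketch, assuming a plain conftest.py at the rootdir; this is not a built-in pytest-forked feature) is to implement pytest's pytest_runtest_logstart hook and write the node ID to stderr before the test body runs. pytest-forked invokes this hook in the parent process before forking, so the name should still appear even if the forked test hangs.

# conftest.py: print each test's node ID to stderr as soon as its run starts,
# so a hanging test can be identified from CI logs.
import sys


def pytest_runtest_logstart(nodeid, location):
    # flush=True so the line is visible immediately, even if the test never finishes
    print(f"STARTING {nodeid}", file=sys.stderr, flush=True)

In a CI log, the last STARTING line without a corresponding test outcome then points at the stuck test.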

Windows support

Great plugin! What would be required to add Windows support? Has anyone tried it yet and gained some insight into the obstacles?
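
For context (a hedged note, not a statement of maintainer plans): the boxing mechanism is built on os.fork, which CPython only provides on POSIX platforms, so a Windows port would need a different process-isolation primitive. A trivial check for the missing building block:

# sketch: detect whether the fork primitive that pytest-forked builds on exists here
import os

if hasattr(os, "fork"):
    print("os.fork is available; forked test isolation can work on this platform")
else:
    print("os.fork is missing (e.g. on Windows); tests cannot be boxed by forking")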

AttributeError: module 'py' has no attribute 'process'

Hello!

I'm getting the following traceback when trying to use pytest-forked. When I look at the output of dir(py) right before ff = py.process.ForkedFunc(runforked), py does indeed seem to have a process attribute, so I'm not sure what the issue could be. This occurs even if I remove the other plugins.

Test session starts (platform: linux, Python 3.11.6, pytest 7.4.3, pytest-sugar 0.9.7)
cachedir: .pytest_cache
Using --randomly-seed=1335576925
rootdir: /build/ndwvj7bzh8lycisww50x36vxv404551b-source
configfile: pyproject.toml
plugins: custom-exit-code-0.3.0, drop-dup-tests-0.3.0, forked-1.6.0, hy-1.0.0.0, ignore-1.0.0.0, lazy-fixture-0.6.3, order-1.1.0, randomly-3.13.0, repeat-0.9.2, sugar-0.9.7, xdist-3.3.1
8 workers [379 items]
scheduling tests via LoadScheduling
INTERNALERROR> def worker_internal_error(self, node, formatted_error):
INTERNALERROR>         """
INTERNALERROR>         pytest_internalerror() was called on the worker.
INTERNALERROR>
INTERNALERROR>         pytest_internalerror() arguments are an excinfo and an excrepr, which can't
INTERNALERROR>         be serialized, so we go with a poor man's solution of raising an exception
INTERNALERROR>         here ourselves using the formatted message.
INTERNALERROR>         """
INTERNALERROR>         self._active_nodes.remove(node)
INTERNALERROR>         try:
INTERNALERROR> >           assert False, formatted_error
INTERNALERROR> E           AssertionError: Traceback (most recent call last):
INTERNALERROR> E               File "/nix/store/swm9wcfarf9kbswclnwdjvlszwpymnr4-python3.11-pytest-7.4.3/lib/python3.11/site-packages/_pytest/main.py", line 271, in wrap_session
INTERNALERROR> E                 session.exitstatus = doit(config, session) or 0
INTERNALERROR> E                                      ^^^^^^^^^^^^^^^^^^^^^
INTERNALERROR> E               File "/nix/store/swm9wcfarf9kbswclnwdjvlszwpymnr4-python3.11-pytest-7.4.3/lib/python3.11/site-packages/_pytest/main.py", line 325, in _main
INTERNALERROR> E                 config.hook.pytest_runtestloop(session=session)
INTERNALERROR> E               File "/nix/store/vbsly1jzi08w10ppkai06jhgk60i5dc8-python3.11-pluggy-1.3.0/lib/python3.11/site-packages/pluggy/_hooks.py", line 493, in __call__
INTERNALERROR> E                 return self._hookexec(self.name, self._hookimpls, kwargs, firstresult)
INTERNALERROR> E                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
INTERNALERROR> E               File "/nix/store/vbsly1jzi08w10ppkai06jhgk60i5dc8-python3.11-pluggy-1.3.0/lib/python3.11/site-packages/pluggy/_manager.py", line 115, in _hookexec
INTERNALERROR> E                 return self._inner_hookexec(hook_name, methods, kwargs, firstresult)
INTERNALERROR> E                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
INTERNALERROR> E               File "/nix/store/vbsly1jzi08w10ppkai06jhgk60i5dc8-python3.11-pluggy-1.3.0/lib/python3.11/site-packages/pluggy/_callers.py", line 152, in _multicall
INTERNALERROR> E                 return outcome.get_result()
INTERNALERROR> E                        ^^^^^^^^^^^^^^^^^^^^
INTERNALERROR> E               File "/nix/store/vbsly1jzi08w10ppkai06jhgk60i5dc8-python3.11-pluggy-1.3.0/lib/python3.11/site-packages/pluggy/_result.py", line 114, in get_result
INTERNALERROR> E                 raise exc.with_traceback(exc.__traceback__)
INTERNALERROR> E               File "/nix/store/vbsly1jzi08w10ppkai06jhgk60i5dc8-python3.11-pluggy-1.3.0/lib/python3.11/site-packages/pluggy/_callers.py", line 77, in _multicall
INTERNALERROR> E                 res = hook_impl.function(*args)
INTERNALERROR> E                       ^^^^^^^^^^^^^^^^^^^^^^^^^
INTERNALERROR> E               File "/nix/store/h4p9m98gf6kcpm0ipy4csw1lppg6hhvb-python3.11-pytest-xdist-3.3.1/lib/python3.11/site-packages/xdist/remote.py", line 157, in pytest_runtestloop
INTERNALERROR> E                 self.run_one_test()
INTERNALERROR> E               File "/nix/store/h4p9m98gf6kcpm0ipy4csw1lppg6hhvb-python3.11-pytest-xdist-3.3.1/lib/python3.11/site-packages/xdist/remote.py", line 174, in run_one_test
INTERNALERROR> E                 self.config.hook.pytest_runtest_protocol(item=item, nextitem=nextitem)
INTERNALERROR> E               File "/nix/store/vbsly1jzi08w10ppkai06jhgk60i5dc8-python3.11-pluggy-1.3.0/lib/python3.11/site-packages/pluggy/_hooks.py", line 493, in __call__
INTERNALERROR> E                 return self._hookexec(self.name, self._hookimpls, kwargs, firstresult)
INTERNALERROR> E                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
INTERNALERROR> E               File "/nix/store/vbsly1jzi08w10ppkai06jhgk60i5dc8-python3.11-pluggy-1.3.0/lib/python3.11/site-packages/pluggy/_manager.py", line 115, in _hookexec
INTERNALERROR> E                 return self._inner_hookexec(hook_name, methods, kwargs, firstresult)
INTERNALERROR> E                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
INTERNALERROR> E               File "/nix/store/vbsly1jzi08w10ppkai06jhgk60i5dc8-python3.11-pluggy-1.3.0/lib/python3.11/site-packages/pluggy/_callers.py", line 152, in _multicall
INTERNALERROR> E                 return outcome.get_result()
INTERNALERROR> E                        ^^^^^^^^^^^^^^^^^^^^
INTERNALERROR> E               File "/nix/store/vbsly1jzi08w10ppkai06jhgk60i5dc8-python3.11-pluggy-1.3.0/lib/python3.11/site-packages/pluggy/_result.py", line 114, in get_result
INTERNALERROR> E                 raise exc.with_traceback(exc.__traceback__)
INTERNALERROR> E               File "/nix/store/vbsly1jzi08w10ppkai06jhgk60i5dc8-python3.11-pluggy-1.3.0/lib/python3.11/site-packages/pluggy/_callers.py", line 77, in _multicall
INTERNALERROR> E                 res = hook_impl.function(*args)
INTERNALERROR> E                       ^^^^^^^^^^^^^^^^^^^^^^^^^
INTERNALERROR> E               File "/nix/store/g6776z6nxpvqch54dgdh3r6hjc48kf4m-python3.11-pytest-forked-1.6.0/lib/python3.11/site-packages/pytest_forked/__init__.py", line 51, in pytest_runtest_protocol
INTERNALERROR> E                 reports = forked_run_report(item)
INTERNALERROR> E                           ^^^^^^^^^^^^^^^^^^^^^^^
INTERNALERROR> E               File "/nix/store/g6776z6nxpvqch54dgdh3r6hjc48kf4m-python3.11-pytest-forked-1.6.0/lib/python3.11/site-packages/pytest_forked/__init__.py", line 73, in forked_run_report
INTERNALERROR> E                 ff = py.process.ForkedFunc(runforked)
INTERNALERROR> E                      ^^^^^^^^^^
INTERNALERROR> E             AttributeError: module 'py' has no attribute 'process'
INTERNALERROR> E           assert False
INTERNALERROR>
INTERNALERROR> /nix/store/h4p9m98gf6kcpm0ipy4csw1lppg6hhvb-python3.11-pytest-xdist-3.3.1/lib/python3.11/site-packages/xdist/dsession.py:197: AssertionError
INTERNALERROR> Traceback (most recent call last):
INTERNALERROR>   File "/nix/store/swm9wcfarf9kbswclnwdjvlszwpymnr4-python3.11-pytest-7.4.3/lib/python3.11/site-packages/_pytest/main.py", line 271, in wrap_session
INTERNALERROR>     session.exitstatus = doit(config, session) or 0
INTERNALERROR>                          ^^^^^^^^^^^^^^^^^^^^^
INTERNALERROR>   File "/nix/store/swm9wcfarf9kbswclnwdjvlszwpymnr4-python3.11-pytest-7.4.3/lib/python3.11/site-packages/_pytest/main.py", line 325, in _main
INTERNALERROR>     config.hook.pytest_runtestloop(session=session)
INTERNALERROR>   File "/nix/store/vbsly1jzi08w10ppkai06jhgk60i5dc8-python3.11-pluggy-1.3.0/lib/python3.11/site-packages/pluggy/_hooks.py", line 493, in __call__
INTERNALERROR>     return self._hookexec(self.name, self._hookimpls, kwargs, firstresult)
INTERNALERROR>            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
INTERNALERROR>   File "/nix/store/vbsly1jzi08w10ppkai06jhgk60i5dc8-python3.11-pluggy-1.3.0/lib/python3.11/site-packages/pluggy/_manager.py", line 115, in _hookexec
INTERNALERROR>     return self._inner_hookexec(hook_name, methods, kwargs, firstresult)
INTERNALERROR>            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
INTERNALERROR>   File "/nix/store/vbsly1jzi08w10ppkai06jhgk60i5dc8-python3.11-pluggy-1.3.0/lib/python3.11/site-packages/pluggy/_callers.py", line 152, in _multicall
INTERNALERROR>     return outcome.get_result()
INTERNALERROR>            ^^^^^^^^^^^^^^^^^^^^
INTERNALERROR>   File "/nix/store/vbsly1jzi08w10ppkai06jhgk60i5dc8-python3.11-pluggy-1.3.0/lib/python3.11/site-packages/pluggy/_result.py", line 114, in get_result
INTERNALERROR>     raise exc.with_traceback(exc.__traceback__)
INTERNALERROR>   File "/nix/store/vbsly1jzi08w10ppkai06jhgk60i5dc8-python3.11-pluggy-1.3.0/lib/python3.11/site-packages/pluggy/_callers.py", line 77, in _multicall
INTERNALERROR>     res = hook_impl.function(*args)
INTERNALERROR>           ^^^^^^^^^^^^^^^^^^^^^^^^^
INTERNALERROR>   File "/nix/store/h4p9m98gf6kcpm0ipy4csw1lppg6hhvb-python3.11-pytest-xdist-3.3.1/lib/python3.11/site-packages/xdist/dsession.py", line 122, in pytest_runtestloop
INTERNALERROR>     self.loop_once()
INTERNALERROR>   File "/nix/store/h4p9m98gf6kcpm0ipy4csw1lppg6hhvb-python3.11-pytest-xdist-3.3.1/lib/python3.11/site-packages/xdist/dsession.py", line 145, in loop_once
INTERNALERROR>     call(**kwargs)
INTERNALERROR>   File "/nix/store/h4p9m98gf6kcpm0ipy4csw1lppg6hhvb-python3.11-pytest-xdist-3.3.1/lib/python3.11/site-packages/xdist/dsession.py", line 184, in worker_workerfinished
INTERNALERROR>     assert not crashitem, (crashitem, node)
INTERNALERROR> AssertionError: ('tests/test_with_cwd.hy::test_with_cwd', <WorkerController gw5>)
INTERNALERROR> assert not 'tests/test_with_cwd.hy::test_with_cwd'

Thank you kindly for the help!
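
A minimal diagnostic sketch (a suggestion under assumptions, not a confirmed fix): check which installation of py is actually imported in that environment, and force py.process to resolve through normal attribute access, which is the same lookup pytest-forked performs.

# diagnostic sketch: run inside the same interpreter/environment as the failing test run
import py

# show which installation of the standalone "py" package is active
print("py imported from:", py.__file__)
# this reproduces the exact attribute lookup that fails inside pytest_forked/__init__.py
print("ForkedFunc:", py.process.ForkedFunc)

If the second line raises the same AttributeError, the py that is importable in that environment does not actually provide the process submodule, whatever dir() reports.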
