
test suite fails when hypothesis is in the environment (a test dependency of pytest) #54

@apteryks

Description

Hi,

I'm trying to update this package to 1.3.0 on GNU Guix, but I'm encountering the following test failures:

============================= test session starts ==============================
platform linux -- Python 3.8.2, pytest-6.2.4, py-1.10.0, pluggy-0.13.1 -- /gnu/store/f8s95qc6dfhl0r45m70hczw5zip0xjxq-python-wrapper-3.8.2/bin/python
cachedir: .pytest_cache
hypothesis profile 'default' -> database=DirectoryBasedExampleDatabase('/tmp/guix-build-python-pytest-forked-1.3.0.drv-0/source/.hypothesis/examples')
rootdir: /tmp/guix-build-python-pytest-forked-1.3.0.drv-0/source, configfile: tox.ini
plugins: hypothesis-5.4.1, forked-1.1.3
collecting ... collected 10 items

testing/test_boxed.py::test_functional_boxed PASSED                      [ 10%]
testing/test_boxed.py::test_functional_boxed_per_test PASSED             [ 20%]
testing/test_boxed.py::test_functional_boxed_capturing[no] PASSED        [ 30%]
testing/test_boxed.py::test_functional_boxed_capturing[sys] XFAIL (c...) [ 40%]
testing/test_boxed.py::test_functional_boxed_capturing[fd] XFAIL (ca...) [ 50%]
testing/test_boxed.py::test_is_not_boxed_by_default PASSED               [ 60%]
testing/test_xfail_behavior.py::test_xfail[strict xfail] FAILED          [ 70%]
testing/test_xfail_behavior.py::test_xfail[strict xpass] FAILED          [ 80%]
testing/test_xfail_behavior.py::test_xfail[non-strict xfail] FAILED      [ 90%]
testing/test_xfail_behavior.py::test_xfail[non-strict xpass] FAILED      [100%]

=================================== FAILURES ===================================
___________________________ test_xfail[strict xfail] ___________________________

is_crashing = True, is_strict = True
testdir = <Testdir local('/tmp/guix-build-python-pytest-forked-1.3.0.drv-0/pytest-of-nixbld/pytest-0/test_xfail0')>

    @pytest.mark.parametrize(
        ('is_crashing', 'is_strict'),
        (
            pytest.param(True, True, id='strict xfail'),
            pytest.param(False, True, id='strict xpass'),
            pytest.param(True, False, id='non-strict xfail'),
            pytest.param(False, False, id='non-strict xpass'),
        ),
    )
    def test_xfail(is_crashing, is_strict, testdir):
        """Test xfail/xpass/strict permutations."""
        # pylint: disable=possibly-unused-variable
        sig_num = signal.SIGTERM.numerator
    
        test_func_body = (
            'os.kill(os.getpid(), signal.SIGTERM)'
            if is_crashing
            else 'assert True'
        )
    
        if is_crashing:
            # marked xfailed and crashing, no matter strict or not
            expected_letter = 'x'  # XFAILED
            expected_lowercase = 'xfailed'
            expected_word = 'XFAIL'
        elif is_strict:
            # strict and not failing as expected should cause failure
            expected_letter = 'F'  # FAILED
            expected_lowercase = 'failed'
            expected_word = FAILED_WORD
        elif not is_strict:
            # non-strict and not failing as expected should cause xpass
            expected_letter = 'X'  # XPASS
            expected_lowercase = 'xpassed'
            expected_word = 'XPASS'
    
        session_start_title = '*==== test session starts ====*'
        loaded_pytest_plugins = 'plugins: forked*'
        collected_tests_num = 'collected 1 item'
        expected_progress = 'test_xfail.py {expected_letter!s}*'.format(**locals())
        failures_title = '*==== FAILURES ====*'
        failures_test_name = '*____ test_function ____*'
        failures_test_reason = '[XPASS(strict)] The process gets terminated'
        short_test_summary_title = '*==== short test summary info ====*'
        short_test_summary = (
            '{expected_word!s} test_xfail.py::test_function'.
            format(**locals())
        )
        if expected_lowercase == 'xpassed':
            # XPASS wouldn't have the crash message from
            # pytest-forked because the crash doesn't happen
            short_test_summary = ' '.join((
                short_test_summary, 'The process gets terminated',
            ))
        reason_string = (
            '  reason: The process gets terminated; '
            'pytest-forked reason: '
            '*:*: running the test CRASHED with signal {sig_num:d}'.
            format(**locals())
        )
        total_summary_line = (
            '*==== 1 {expected_lowercase!s} in 0.*s* ====*'.
            format(**locals())
        )
    
        expected_lines = (
            session_start_title,
            loaded_pytest_plugins,
            collected_tests_num,
            expected_progress,
        )
        if expected_word == FAILED_WORD:
            # XPASS(strict)
            expected_lines += (
                failures_title,
                failures_test_name,
                failures_test_reason,
            )
        expected_lines += (
            short_test_summary_title,
            short_test_summary,
        )
        if expected_lowercase == 'xpassed' and expected_word == FAILED_WORD:
            # XPASS(strict)
            expected_lines += (
                reason_string,
            )
        expected_lines += (
            total_summary_line,
        )
    
        test_module = testdir.makepyfile(
            """
            import os
            import signal
    
            import pytest
    
            # The current implementation emits RuntimeWarning.
            pytestmark = pytest.mark.filterwarnings('ignore:pytest-forked xfail')
    
            @pytest.mark.xfail(
                reason='The process gets terminated',
                strict={is_strict!s},
            )
            @pytest.mark.forked
            def test_function():
                {test_func_body!s}
            """.
            format(**locals())
        )
    
        pytest_run_result = testdir.runpytest(test_module, '-ra')
>       pytest_run_result.stdout.fnmatch_lines(expected_lines)
E       Failed: fnmatch: '*==== test session starts ====*'
E          with: '============================= test session starts =============================='
E       nomatch: 'plugins: forked*'
E           and: 'platform linux -- Python 3.8.2, pytest-6.2.4, py-1.10.0, pluggy-0.13.1'
E           and: 'rootdir: /tmp/guix-build-python-pytest-forked-1.3.0.drv-0/pytest-of-nixbld/pytest-0/test_xfail0'
E           and: 'plugins: hypothesis-5.4.1, forked-1.1.3'
E           and: 'collected 1 item'
E           and: ''
E           and: 'test_xfail.py x                                                          [100%]'
E           and: ''
E           and: '=========================== short test summary info ============================'
E           and: 'XFAIL test_xfail.py::test_function'
E           and: '  reason: The process gets terminated; pytest-forked reason: :-1: running the test CRASHED with signal 15'
E           and: '============================== 1 xfailed in 0.03s =============================='
E       remains unmatched: 'plugins: forked*'

/tmp/guix-build-python-pytest-forked-1.3.0.drv-0/source/testing/test_xfail_behavior.py:130: Failed
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.2, pytest-6.2.4, py-1.10.0, pluggy-0.13.1
rootdir: /tmp/guix-build-python-pytest-forked-1.3.0.drv-0/pytest-of-nixbld/pytest-0/test_xfail0
plugins: hypothesis-5.4.1, forked-1.1.3
collected 1 item

test_xfail.py x                                                          [100%]

=========================== short test summary info ============================
XFAIL test_xfail.py::test_function
  reason: The process gets terminated; pytest-forked reason: :-1: running the test CRASHED with signal 15
============================== 1 xfailed in 0.03s ==============================
___________________________ test_xfail[strict xpass] ___________________________

is_crashing = False, is_strict = True
testdir = <Testdir local('/tmp/guix-build-python-pytest-forked-1.3.0.drv-0/pytest-of-nixbld/pytest-0/test_xfail1')>

    @pytest.mark.parametrize(
        ('is_crashing', 'is_strict'),
        (
            pytest.param(True, True, id='strict xfail'),
            pytest.param(False, True, id='strict xpass'),
            pytest.param(True, False, id='non-strict xfail'),
            pytest.param(False, False, id='non-strict xpass'),
        ),
    )
    def test_xfail(is_crashing, is_strict, testdir):
        """Test xfail/xpass/strict permutations."""
        # pylint: disable=possibly-unused-variable
        sig_num = signal.SIGTERM.numerator
    
        test_func_body = (
            'os.kill(os.getpid(), signal.SIGTERM)'
            if is_crashing
            else 'assert True'
        )
    
        if is_crashing:
            # marked xfailed and crashing, no matter strict or not
            expected_letter = 'x'  # XFAILED
            expected_lowercase = 'xfailed'
            expected_word = 'XFAIL'
        elif is_strict:
            # strict and not failing as expected should cause failure
            expected_letter = 'F'  # FAILED
            expected_lowercase = 'failed'
            expected_word = FAILED_WORD
        elif not is_strict:
            # non-strict and not failing as expected should cause xpass
            expected_letter = 'X'  # XPASS
            expected_lowercase = 'xpassed'
            expected_word = 'XPASS'
    
        session_start_title = '*==== test session starts ====*'
        loaded_pytest_plugins = 'plugins: forked*'
        collected_tests_num = 'collected 1 item'
        expected_progress = 'test_xfail.py {expected_letter!s}*'.format(**locals())
        failures_title = '*==== FAILURES ====*'
        failures_test_name = '*____ test_function ____*'
        failures_test_reason = '[XPASS(strict)] The process gets terminated'
        short_test_summary_title = '*==== short test summary info ====*'
        short_test_summary = (
            '{expected_word!s} test_xfail.py::test_function'.
            format(**locals())
        )
        if expected_lowercase == 'xpassed':
            # XPASS wouldn't have the crash message from
            # pytest-forked because the crash doesn't happen
            short_test_summary = ' '.join((
                short_test_summary, 'The process gets terminated',
            ))
        reason_string = (
            '  reason: The process gets terminated; '
            'pytest-forked reason: '
            '*:*: running the test CRASHED with signal {sig_num:d}'.
            format(**locals())
        )
        total_summary_line = (
            '*==== 1 {expected_lowercase!s} in 0.*s* ====*'.
            format(**locals())
        )
    
        expected_lines = (
            session_start_title,
            loaded_pytest_plugins,
            collected_tests_num,
            expected_progress,
        )
        if expected_word == FAILED_WORD:
            # XPASS(strict)
            expected_lines += (
                failures_title,
                failures_test_name,
                failures_test_reason,
            )
        expected_lines += (
            short_test_summary_title,
            short_test_summary,
        )
        if expected_lowercase == 'xpassed' and expected_word == FAILED_WORD:
            # XPASS(strict)
            expected_lines += (
                reason_string,
            )
        expected_lines += (
            total_summary_line,
        )
    
        test_module = testdir.makepyfile(
            """
            import os
            import signal
    
            import pytest
    
            # The current implementation emits RuntimeWarning.
            pytestmark = pytest.mark.filterwarnings('ignore:pytest-forked xfail')
    
            @pytest.mark.xfail(
                reason='The process gets terminated',
                strict={is_strict!s},
            )
            @pytest.mark.forked
            def test_function():
                {test_func_body!s}
            """.
            format(**locals())
        )
    
        pytest_run_result = testdir.runpytest(test_module, '-ra')
>       pytest_run_result.stdout.fnmatch_lines(expected_lines)
E       Failed: fnmatch: '*==== test session starts ====*'
E          with: '============================= test session starts =============================='
E       nomatch: 'plugins: forked*'
E           and: 'platform linux -- Python 3.8.2, pytest-6.2.4, py-1.10.0, pluggy-0.13.1'
E           and: 'rootdir: /tmp/guix-build-python-pytest-forked-1.3.0.drv-0/pytest-of-nixbld/pytest-0/test_xfail1'
E           and: 'plugins: hypothesis-5.4.1, forked-1.1.3'
E           and: 'collected 1 item'
E           and: ''
E           and: 'test_xfail.py F                                                          [100%]'
E           and: ''
E           and: '=================================== FAILURES ==================================='
E           and: '________________________________ test_function _________________________________'
E           and: '[XPASS(strict)] The process gets terminated'
E           and: '=========================== short test summary info ============================'
E           and: 'FAILED test_xfail.py::test_function'
E           and: '============================== 1 failed in 0.03s ==============================='
E       remains unmatched: 'plugins: forked*'

/tmp/guix-build-python-pytest-forked-1.3.0.drv-0/source/testing/test_xfail_behavior.py:130: Failed
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.2, pytest-6.2.4, py-1.10.0, pluggy-0.13.1
rootdir: /tmp/guix-build-python-pytest-forked-1.3.0.drv-0/pytest-of-nixbld/pytest-0/test_xfail1
plugins: hypothesis-5.4.1, forked-1.1.3
collected 1 item

test_xfail.py F                                                          [100%]

=================================== FAILURES ===================================
________________________________ test_function _________________________________
[XPASS(strict)] The process gets terminated
=========================== short test summary info ============================
FAILED test_xfail.py::test_function
============================== 1 failed in 0.03s ===============================
_________________________ test_xfail[non-strict xfail] _________________________

is_crashing = True, is_strict = False
testdir = <Testdir local('/tmp/guix-build-python-pytest-forked-1.3.0.drv-0/pytest-of-nixbld/pytest-0/test_xfail2')>

    @pytest.mark.parametrize(
        ('is_crashing', 'is_strict'),
        (
            pytest.param(True, True, id='strict xfail'),
            pytest.param(False, True, id='strict xpass'),
            pytest.param(True, False, id='non-strict xfail'),
            pytest.param(False, False, id='non-strict xpass'),
        ),
    )
    def test_xfail(is_crashing, is_strict, testdir):
        """Test xfail/xpass/strict permutations."""
        # pylint: disable=possibly-unused-variable
        sig_num = signal.SIGTERM.numerator
    
        test_func_body = (
            'os.kill(os.getpid(), signal.SIGTERM)'
            if is_crashing
            else 'assert True'
        )
    
        if is_crashing:
            # marked xfailed and crashing, no matter strict or not
            expected_letter = 'x'  # XFAILED
            expected_lowercase = 'xfailed'
            expected_word = 'XFAIL'
        elif is_strict:
            # strict and not failing as expected should cause failure
            expected_letter = 'F'  # FAILED
            expected_lowercase = 'failed'
            expected_word = FAILED_WORD
        elif not is_strict:
            # non-strict and not failing as expected should cause xpass
            expected_letter = 'X'  # XPASS
            expected_lowercase = 'xpassed'
            expected_word = 'XPASS'
    
        session_start_title = '*==== test session starts ====*'
        loaded_pytest_plugins = 'plugins: forked*'
        collected_tests_num = 'collected 1 item'
        expected_progress = 'test_xfail.py {expected_letter!s}*'.format(**locals())
        failures_title = '*==== FAILURES ====*'
        failures_test_name = '*____ test_function ____*'
        failures_test_reason = '[XPASS(strict)] The process gets terminated'
        short_test_summary_title = '*==== short test summary info ====*'
        short_test_summary = (
            '{expected_word!s} test_xfail.py::test_function'.
            format(**locals())
        )
        if expected_lowercase == 'xpassed':
            # XPASS wouldn't have the crash message from
            # pytest-forked because the crash doesn't happen
            short_test_summary = ' '.join((
                short_test_summary, 'The process gets terminated',
            ))
        reason_string = (
            '  reason: The process gets terminated; '
            'pytest-forked reason: '
            '*:*: running the test CRASHED with signal {sig_num:d}'.
            format(**locals())
        )
        total_summary_line = (
            '*==== 1 {expected_lowercase!s} in 0.*s* ====*'.
            format(**locals())
        )
    
        expected_lines = (
            session_start_title,
            loaded_pytest_plugins,
            collected_tests_num,
            expected_progress,
        )
        if expected_word == FAILED_WORD:
            # XPASS(strict)
            expected_lines += (
                failures_title,
                failures_test_name,
                failures_test_reason,
            )
        expected_lines += (
            short_test_summary_title,
            short_test_summary,
        )
        if expected_lowercase == 'xpassed' and expected_word == FAILED_WORD:
            # XPASS(strict)
            expected_lines += (
                reason_string,
            )
        expected_lines += (
            total_summary_line,
        )
    
        test_module = testdir.makepyfile(
            """
            import os
            import signal
    
            import pytest
    
            # The current implementation emits RuntimeWarning.
            pytestmark = pytest.mark.filterwarnings('ignore:pytest-forked xfail')
    
            @pytest.mark.xfail(
                reason='The process gets terminated',
                strict={is_strict!s},
            )
            @pytest.mark.forked
            def test_function():
                {test_func_body!s}
            """.
            format(**locals())
        )
    
        pytest_run_result = testdir.runpytest(test_module, '-ra')
>       pytest_run_result.stdout.fnmatch_lines(expected_lines)
E       Failed: fnmatch: '*==== test session starts ====*'
E          with: '============================= test session starts =============================='
E       nomatch: 'plugins: forked*'
E           and: 'platform linux -- Python 3.8.2, pytest-6.2.4, py-1.10.0, pluggy-0.13.1'
E           and: 'rootdir: /tmp/guix-build-python-pytest-forked-1.3.0.drv-0/pytest-of-nixbld/pytest-0/test_xfail2'
E           and: 'plugins: hypothesis-5.4.1, forked-1.1.3'
E           and: 'collected 1 item'
E           and: ''
E           and: 'test_xfail.py x                                                          [100%]'
E           and: ''
E           and: '=========================== short test summary info ============================'
E           and: 'XFAIL test_xfail.py::test_function'
E           and: '  reason: The process gets terminated; pytest-forked reason: :-1: running the test CRASHED with signal 15'
E           and: '============================== 1 xfailed in 0.03s =============================='
E       remains unmatched: 'plugins: forked*'

/tmp/guix-build-python-pytest-forked-1.3.0.drv-0/source/testing/test_xfail_behavior.py:130: Failed
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.2, pytest-6.2.4, py-1.10.0, pluggy-0.13.1
rootdir: /tmp/guix-build-python-pytest-forked-1.3.0.drv-0/pytest-of-nixbld/pytest-0/test_xfail2
plugins: hypothesis-5.4.1, forked-1.1.3
collected 1 item

test_xfail.py x                                                          [100%]

=========================== short test summary info ============================
XFAIL test_xfail.py::test_function
  reason: The process gets terminated; pytest-forked reason: :-1: running the test CRASHED with signal 15
============================== 1 xfailed in 0.03s ==============================
_________________________ test_xfail[non-strict xpass] _________________________

is_crashing = False, is_strict = False
testdir = <Testdir local('/tmp/guix-build-python-pytest-forked-1.3.0.drv-0/pytest-of-nixbld/pytest-0/test_xfail3')>

    @pytest.mark.parametrize(
        ('is_crashing', 'is_strict'),
        (
            pytest.param(True, True, id='strict xfail'),
            pytest.param(False, True, id='strict xpass'),
            pytest.param(True, False, id='non-strict xfail'),
            pytest.param(False, False, id='non-strict xpass'),
        ),
    )
    def test_xfail(is_crashing, is_strict, testdir):
        """Test xfail/xpass/strict permutations."""
        # pylint: disable=possibly-unused-variable
        sig_num = signal.SIGTERM.numerator
    
        test_func_body = (
            'os.kill(os.getpid(), signal.SIGTERM)'
            if is_crashing
            else 'assert True'
        )
    
        if is_crashing:
            # marked xfailed and crashing, no matter strict or not
            expected_letter = 'x'  # XFAILED
            expected_lowercase = 'xfailed'
            expected_word = 'XFAIL'
        elif is_strict:
            # strict and not failing as expected should cause failure
            expected_letter = 'F'  # FAILED
            expected_lowercase = 'failed'
            expected_word = FAILED_WORD
        elif not is_strict:
            # non-strict and not failing as expected should cause xpass
            expected_letter = 'X'  # XPASS
            expected_lowercase = 'xpassed'
            expected_word = 'XPASS'
    
        session_start_title = '*==== test session starts ====*'
        loaded_pytest_plugins = 'plugins: forked*'
        collected_tests_num = 'collected 1 item'
        expected_progress = 'test_xfail.py {expected_letter!s}*'.format(**locals())
        failures_title = '*==== FAILURES ====*'
        failures_test_name = '*____ test_function ____*'
        failures_test_reason = '[XPASS(strict)] The process gets terminated'
        short_test_summary_title = '*==== short test summary info ====*'
        short_test_summary = (
            '{expected_word!s} test_xfail.py::test_function'.
            format(**locals())
        )
        if expected_lowercase == 'xpassed':
            # XPASS wouldn't have the crash message from
            # pytest-forked because the crash doesn't happen
            short_test_summary = ' '.join((
                short_test_summary, 'The process gets terminated',
            ))
        reason_string = (
            '  reason: The process gets terminated; '
            'pytest-forked reason: '
            '*:*: running the test CRASHED with signal {sig_num:d}'.
            format(**locals())
        )
        total_summary_line = (
            '*==== 1 {expected_lowercase!s} in 0.*s* ====*'.
            format(**locals())
        )
    
        expected_lines = (
            session_start_title,
            loaded_pytest_plugins,
            collected_tests_num,
            expected_progress,
        )
        if expected_word == FAILED_WORD:
            # XPASS(strict)
            expected_lines += (
                failures_title,
                failures_test_name,
                failures_test_reason,
            )
        expected_lines += (
            short_test_summary_title,
            short_test_summary,
        )
        if expected_lowercase == 'xpassed' and expected_word == FAILED_WORD:
            # XPASS(strict)
            expected_lines += (
                reason_string,
            )
        expected_lines += (
            total_summary_line,
        )
    
        test_module = testdir.makepyfile(
            """
            import os
            import signal
    
            import pytest
    
            # The current implementation emits RuntimeWarning.
            pytestmark = pytest.mark.filterwarnings('ignore:pytest-forked xfail')
    
            @pytest.mark.xfail(
                reason='The process gets terminated',
                strict={is_strict!s},
            )
            @pytest.mark.forked
            def test_function():
                {test_func_body!s}
            """.
            format(**locals())
        )
    
        pytest_run_result = testdir.runpytest(test_module, '-ra')
>       pytest_run_result.stdout.fnmatch_lines(expected_lines)
E       Failed: fnmatch: '*==== test session starts ====*'
E          with: '============================= test session starts =============================='
E       nomatch: 'plugins: forked*'
E           and: 'platform linux -- Python 3.8.2, pytest-6.2.4, py-1.10.0, pluggy-0.13.1'
E           and: 'rootdir: /tmp/guix-build-python-pytest-forked-1.3.0.drv-0/pytest-of-nixbld/pytest-0/test_xfail3'
E           and: 'plugins: hypothesis-5.4.1, forked-1.1.3'
E           and: 'collected 1 item'
E           and: ''
E           and: 'test_xfail.py X                                                          [100%]'
E           and: ''
E           and: '=========================== short test summary info ============================'
E           and: 'XPASS test_xfail.py::test_function The process gets terminated'
E           and: '============================== 1 xpassed in 0.04s =============================='
E       remains unmatched: 'plugins: forked*'

/tmp/guix-build-python-pytest-forked-1.3.0.drv-0/source/testing/test_xfail_behavior.py:130: Failed
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.2, pytest-6.2.4, py-1.10.0, pluggy-0.13.1
rootdir: /tmp/guix-build-python-pytest-forked-1.3.0.drv-0/pytest-of-nixbld/pytest-0/test_xfail3
plugins: hypothesis-5.4.1, forked-1.1.3
collected 1 item

test_xfail.py X                                                          [100%]

=========================== short test summary info ============================
XPASS test_xfail.py::test_function The process gets terminated
============================== 1 xpassed in 0.04s ==============================
=========================== short test summary info ============================
FAILED testing/test_xfail_behavior.py::test_xfail[strict xfail] - Failed: fnm...
FAILED testing/test_xfail_behavior.py::test_xfail[strict xpass] - Failed: fnm...
FAILED testing/test_xfail_behavior.py::test_xfail[non-strict xfail] - Failed:...
FAILED testing/test_xfail_behavior.py::test_xfail[non-strict xpass] - Failed:...
XFAIL testing/test_boxed.py::test_functional_boxed_capturing[sys]
  capture cleanup needed
XFAIL testing/test_boxed.py::test_functional_boxed_capturing[fd]
  capture cleanup needed
==================== 4 failed, 4 passed, 2 xfailed in 1.62s ====================
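The root cause is visible in the `nomatch` output above: every expected-lines tuple contains the anchored pattern `plugins: forked*`, but when hypothesis is installed the plugins line reads `plugins: hypothesis-5.4.1, forked-1.1.3`, so pytest's `fnmatch_lines` can never match it. A minimal sketch of the mismatch, using only the stdlib `fnmatch` module (the `plugins: *forked*` pattern below is a hypothetical, more tolerant alternative, not necessarily the upstream fix):

```python
import fnmatch

# The plugins line pytest actually prints when hypothesis is installed:
line = 'plugins: hypothesis-5.4.1, forked-1.1.3'

# The pattern used by the test suite is anchored at 'forked',
# so any plugin listed before it breaks the match:
print(fnmatch.fnmatch(line, 'plugins: forked*'))   # False

# A pattern with a leading wildcard would tolerate extra plugins:
print(fnmatch.fnmatch(line, 'plugins: *forked*'))  # True
```

`fnmatch_lines` applies this same glob-style matching line by line, which is why the run aborts with `remains unmatched: 'plugins: forked*'` even though the forked plugin is clearly loaded.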

The only direct dependency is pytest 6.2.4, but in case it's useful, here is the full list of transitive dependencies:

$ ./pre-inst-env guix refresh --list-transitive python-pytest-forked
python-pytest-forked@1.3.0 depends on the following 91 packages: python-appdirs@1.4.3 python-distlib@0.3.0 python-filelock@3.0.12 python-six-bootstrap@1.14.0 python-sortedcontainers@2.1.0 python-hypothesis@5.4.1 python-nose@1.3.7 python-lxml@4.4.2 python-elementpath@1.4.0 python-xmlschema@1.1.2 python-pytest-bootstrap@6.2.4 python-toml@0.10.2 python-iniconfig@1.1.1 python-atomicwrites@1.3.0 python-attrs-bootstrap@19.3.0 python-more-itertools@8.2.0 python-packaging-bootstrap@20.0 python-pluggy@0.13.1 python-wcwidth@0.1.8 gdbm@1.18.1 sqlite@3.31.1 libxrender@0.9.10 libxft@2.3.3 gperf@3.1 gzip@1.10 bzip2@1.0.8 unzip@6.0 font-dejavu@2.37 libpng@1.6.37 freetype@2.10.4 tar@1.32 net-base@5.3 util-linux@2.35.1 fontconfig@2.13.1 tcl@8.6.10 xtrans@1.4.0 xcb-proto@1.14 libunistring@0.9.10 gettext-minimal@0.20.1 libgpg-error@1.37 libgcrypt@1.8.5 libxml2@2.9.10 readline@8.0.4 ncurses@6.2 bash@5.0.16 tzdata@2019c expat@2.2.9 libffi@3.3 perl@5.30.2 openssl@1.1.1j python-minimal@3.8.2 python-minimal-wrapper@3.8.2 zlib@1.2.11 xz@5.2.4 libxslt@1.1.34 libpthread-stubs@0.4 libxau@1.0.9 libbsd@0.10.0 libxdmcp@1.1.3 libxcb@1.14 libx11@1.6.10 pkg-config@0.29.2 util-macros@1.19.2 xorgproto@2019.2 libxext@1.3.4 tk@8.6.10 python@3.8.2 python-wrapper@3.8.2 python-setuptools-scm@3.4.3 python-py@1.10.0 python-pytest@6.2.4 tar@1.32 gzip@1.10 bzip2@1.0.8 xz@5.2.4 file@5.38 diffutils@3.7 patch@2.7.6 findutils@4.7.0 gawk@5.0.1 sed@4.8 grep@3.4 coreutils@8.32 make@4.3 guile@3.0.2 bash-minimal@5.0.16 ld-wrapper@0 binutils@2.34 gcc@7.5.0 glibc@2.31 glibc-utf8-locales@2.31
