Mock Version: 5.0
Mock Version: 5.0
Mock Version: 5.0
ENTER ['do_with_status'](['bash', '--login', '-c', '/usr/bin/rpmbuild -bs --noclean --target noarch --nodeps /builddir/build/SPECS/python-bidict.spec'], chrootPath='/var/lib/mock/f40-build-2426561-60598/root'env={'TERM': 'vt100', 'SHELL': '/bin/bash', 'HOME': '/builddir', 'HOSTNAME': 'mock', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'PROMPT_COMMAND': 'printf "\\033]0;\\007"', 'PS1': ' \\s-\\v\\$ ', 'LANG': 'C.UTF-8'}shell=Falselogger=timeout=864000uid=996gid=135user='mockbuild'nspawn_args=[]unshare_net=TrueprintOutput=False)
Executing command: ['bash', '--login', '-c', '/usr/bin/rpmbuild -bs --noclean --target noarch --nodeps /builddir/build/SPECS/python-bidict.spec'] with env {'TERM': 'vt100', 'SHELL': '/bin/bash', 'HOME': '/builddir', 'HOSTNAME': 'mock', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'PROMPT_COMMAND': 'printf "\\033]0;\\007"', 'PS1': ' \\s-\\v\\$ ', 'LANG': 'C.UTF-8'} and shell False
error: %changelog not in descending chronological order
Building target platforms: noarch
Building for target noarch
setting SOURCE_DATE_EPOCH=1706227200
Wrote: /builddir/build/SRPMS/python-bidict-0.22.1-3.fc40.src.rpm
Child return code was: 0
ENTER ['do_with_status'](['bash', '--login', '-c', '/usr/bin/rpmbuild -br --noclean --target noarch --nodeps /builddir/build/SPECS/python-bidict.spec'], chrootPath='/var/lib/mock/f40-build-2426561-60598/root'env={'TERM': 'vt100', 'SHELL': '/bin/bash', 'HOME': '/builddir', 'HOSTNAME': 'mock', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'PROMPT_COMMAND': 'printf "\\033]0;\\007"', 'PS1': ' \\s-\\v\\$ ', 'LANG': 'C.UTF-8'}shell=Falselogger=timeout=864000uid=996gid=135user='mockbuild'nspawn_args=[]unshare_net=TrueraiseExc=FalseprintOutput=False)
Executing command: ['bash', '--login', '-c', '/usr/bin/rpmbuild -br --noclean --target noarch --nodeps /builddir/build/SPECS/python-bidict.spec'] with env {'TERM': 'vt100', 'SHELL': '/bin/bash', 'HOME': '/builddir', 'HOSTNAME': 'mock', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'PROMPT_COMMAND': 'printf "\\033]0;\\007"', 'PS1': ' \\s-\\v\\$ ', 'LANG': 'C.UTF-8'} and shell False
error: %changelog not in descending chronological order
Building target platforms: noarch
Building for target noarch
setting SOURCE_DATE_EPOCH=1706227200
Executing(%prep): /bin/sh -e /var/tmp/rpm-tmp.nL8IY0
+ umask 022
+ cd /builddir/build/BUILD
+ cd /builddir/build/BUILD
+ rm -rf bidict-0.22.1
+ /usr/lib/rpm/rpmuncompress -x /builddir/build/SOURCES/bidict-0.22.1.tar.gz
+ STATUS=0
+ '[' 0 -ne 0 ']'
+ cd bidict-0.22.1
+ rm -rf /builddir/build/BUILD/bidict-0.22.1-SPECPARTS
+ /usr/bin/mkdir -p /builddir/build/BUILD/bidict-0.22.1-SPECPARTS
+ /usr/bin/chmod -Rf a+rX,u+w,g-w,o-w .
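[Annotation: on Fedora, rpmbuild normally derives SOURCE_DATE_EPOCH from the newest %changelog entry so file timestamps in the package are reproducible. A quick way to decode the value shown above; this snippet is illustrative only and is not part of the build.]

    # Decode the SOURCE_DATE_EPOCH value logged above (1706227200).
    from datetime import datetime, timezone

    epoch = 1706227200
    print(datetime.fromtimestamp(epoch, tz=timezone.utc).isoformat())
    # -> 2024-01-26T00:00:00+00:00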
+ /usr/lib/rpm/rpmuncompress /builddir/build/SOURCES/0001-Test-with-python3.12-beta.patch + /usr/bin/patch -p1 -s --fuzz=0 --no-backup-if-mismatch -f + sed -r -i 's/"(pytest-(cov|benchmark))\b/# &/' pyproject.toml + sed -r -i /--benchmark-disable/d pytest.ini + rm -rf SPECPARTS + RPM_EC=0 ++ jobs -p + exit 0 Executing(%generate_buildrequires): /bin/sh -e /var/tmp/rpm-tmp.05Kye3 + umask 022 + cd /builddir/build/BUILD + cd bidict-0.22.1 + CFLAGS='-O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Werror=implicit-function-declaration -Werror=implicit-int -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fno-omit-frame-pointer ' + export CFLAGS + CXXFLAGS='-O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fno-omit-frame-pointer ' + export CXXFLAGS + FFLAGS='-O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fno-omit-frame-pointer -I/usr/lib/gfortran/modules ' + export FFLAGS + FCFLAGS='-O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fno-omit-frame-pointer -I/usr/lib/gfortran/modules ' + export FCFLAGS + VALAFLAGS=-g + export VALAFLAGS + RUSTFLAGS='-Copt-level=3 -Cdebuginfo=2 -Ccodegen-units=1 -Cstrip=none -Cforce-frame-pointers=yes -Clink-arg=-Wl,-z,relro -Clink-arg=-Wl,-z,now --cap-lints=warn' + export RUSTFLAGS + LDFLAGS='-Wl,-z,relro -Wl,--as-needed -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -Wl,--build-id=sha1 ' + export LDFLAGS + LT_SYS_LIBRARY_PATH=/usr/lib: + export LT_SYS_LIBRARY_PATH + CC=gcc + export CC + CXX=g++ + export CXX + echo pyproject-rpm-macros + echo python3-devel + echo 'python3dist(pip) >= 19' + echo 'python3dist(packaging)' + '[' -f pyproject.toml ']' + echo '(python3dist(tomli) if python3-devel < 3.11)' + rm -rfv '*.dist-info/' + '[' -f /usr/bin/python3 ']' + mkdir -p /builddir/build/BUILD/bidict-0.22.1/.pyproject-builddir + echo -n + CFLAGS='-O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Werror=implicit-function-declaration -Werror=implicit-int -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fno-omit-frame-pointer ' + LDFLAGS='-Wl,-z,relro -Wl,--as-needed -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -Wl,--build-id=sha1 ' + TMPDIR=/builddir/build/BUILD/bidict-0.22.1/.pyproject-builddir + RPM_TOXENV=py312 + HOSTNAME=rpmbuild + /usr/bin/python3 -Bs /usr/lib/rpm/redhat/pyproject_buildrequires.py --generate-extras --python3_pkgversion 3 --wheeldir 
/builddir/build/BUILD/bidict-0.22.1/pyproject-wheeldir --output /builddir/build/BUILD/python-bidict-0.22.1-3.fc40.noarch-pyproject-buildrequires -x test Handling setuptools >= 40.9.0 from build-system.requires Requirement not satisfied: setuptools >= 40.9.0 Exiting dependency generation pass: build backend + cat /builddir/build/BUILD/python-bidict-0.22.1-3.fc40.noarch-pyproject-buildrequires + rm -rfv '*.dist-info/' + RPM_EC=0 ++ jobs -p + exit 0 Wrote: /builddir/build/SRPMS/python-bidict-0.22.1-3.fc40.buildreqs.nosrc.rpm Child return code was: 11 Dynamic buildrequires detected Going to install missing buildrequires. See root.log for details. ENTER ['do_with_status'](['bash', '--login', '-c', '/usr/bin/rpmbuild -br --noclean --target noarch --nodeps /builddir/build/SPECS/python-bidict.spec'], chrootPath='/var/lib/mock/f40-build-2426561-60598/root'env={'TERM': 'vt100', 'SHELL': '/bin/bash', 'HOME': '/builddir', 'HOSTNAME': 'mock', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'PROMPT_COMMAND': 'printf "\\033]0;\\007"', 'PS1': ' \\s-\\v\\$ ', 'LANG': 'C.UTF-8'}shell=Falselogger=timeout=864000uid=996gid=135user='mockbuild'nspawn_args=[]unshare_net=TrueraiseExc=FalseprintOutput=False) Executing command: ['bash', '--login', '-c', '/usr/bin/rpmbuild -br --noclean --target noarch --nodeps /builddir/build/SPECS/python-bidict.spec'] with env {'TERM': 'vt100', 'SHELL': '/bin/bash', 'HOME': '/builddir', 'HOSTNAME': 'mock', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'PROMPT_COMMAND': 'printf "\\033]0;\\007"', 'PS1': ' \\s-\\v\\$ ', 'LANG': 'C.UTF-8'} and shell False error: %changelog not in descending chronological order Building target platforms: noarch Building for target noarch setting SOURCE_DATE_EPOCH=1706227200 Executing(%prep): /bin/sh -e /var/tmp/rpm-tmp.LEJesR + umask 022 + cd /builddir/build/BUILD + cd /builddir/build/BUILD + rm -rf bidict-0.22.1 + /usr/lib/rpm/rpmuncompress -x /builddir/build/SOURCES/bidict-0.22.1.tar.gz + STATUS=0 + '[' 0 -ne 0 ']' + cd bidict-0.22.1 + rm -rf /builddir/build/BUILD/bidict-0.22.1-SPECPARTS + /usr/bin/mkdir -p /builddir/build/BUILD/bidict-0.22.1-SPECPARTS + /usr/bin/chmod -Rf a+rX,u+w,g-w,o-w . 
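[Annotation: pyproject_buildrequires.py checks each declared requirement against what is already installed in the chroot; as soon as one is missing ("Requirement not satisfied: setuptools >= 40.9.0"), it stops, rpmbuild exits with code 11, and mock installs the BuildRequires emitted so far and re-runs rpmbuild -br, which is why %prep and the generator appear several times in this log. A simplified sketch of that kind of check, using packaging and importlib.metadata; this is not the actual macro code.]

    # Simplified illustration: does an installed distribution satisfy a PEP 508
    # requirement? This mirrors the "Requirement (not) satisfied" lines above.
    from importlib.metadata import version, PackageNotFoundError
    from packaging.requirements import Requirement

    def satisfied(req_string: str) -> bool:
        req = Requirement(req_string)
        try:
            installed = version(req.name)
        except PackageNotFoundError:
            return False  # not installed in this environment at all
        return req.specifier.contains(installed, prereleases=True)

    # False in the first pass above; True once setuptools 68.2.2 is in the chroot.
    print(satisfied("setuptools >= 40.9.0"))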
+ /usr/lib/rpm/rpmuncompress /builddir/build/SOURCES/0001-Test-with-python3.12-beta.patch + /usr/bin/patch -p1 -s --fuzz=0 --no-backup-if-mismatch -f + sed -r -i 's/"(pytest-(cov|benchmark))\b/# &/' pyproject.toml + sed -r -i /--benchmark-disable/d pytest.ini + rm -rf SPECPARTS + RPM_EC=0 ++ jobs -p + exit 0 Executing(%generate_buildrequires): /bin/sh -e /var/tmp/rpm-tmp.xatjM5 + umask 022 + cd /builddir/build/BUILD + cd bidict-0.22.1 + CFLAGS='-O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Werror=implicit-function-declaration -Werror=implicit-int -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fno-omit-frame-pointer ' + export CFLAGS + CXXFLAGS='-O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fno-omit-frame-pointer ' + export CXXFLAGS + FFLAGS='-O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fno-omit-frame-pointer -I/usr/lib/gfortran/modules ' + export FFLAGS + FCFLAGS='-O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fno-omit-frame-pointer -I/usr/lib/gfortran/modules ' + export FCFLAGS + VALAFLAGS=-g + export VALAFLAGS + RUSTFLAGS='-Copt-level=3 -Cdebuginfo=2 -Ccodegen-units=1 -Cstrip=none -Cforce-frame-pointers=yes -Clink-arg=-Wl,-z,relro -Clink-arg=-Wl,-z,now --cap-lints=warn' + export RUSTFLAGS + LDFLAGS='-Wl,-z,relro -Wl,--as-needed -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -Wl,--build-id=sha1 ' + export LDFLAGS + LT_SYS_LIBRARY_PATH=/usr/lib: + export LT_SYS_LIBRARY_PATH + CC=gcc + export CC + CXX=g++ + export CXX + echo pyproject-rpm-macros + echo python3-devel + echo 'python3dist(pip) >= 19' + echo 'python3dist(packaging)' + '[' -f pyproject.toml ']' + echo '(python3dist(tomli) if python3-devel < 3.11)' + rm -rfv '*.dist-info/' + '[' -f /usr/bin/python3 ']' + mkdir -p /builddir/build/BUILD/bidict-0.22.1/.pyproject-builddir + echo -n + CFLAGS='-O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Werror=implicit-function-declaration -Werror=implicit-int -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fno-omit-frame-pointer ' + LDFLAGS='-Wl,-z,relro -Wl,--as-needed -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -Wl,--build-id=sha1 ' + TMPDIR=/builddir/build/BUILD/bidict-0.22.1/.pyproject-builddir + RPM_TOXENV=py312 + HOSTNAME=rpmbuild + /usr/bin/python3 -Bs /usr/lib/rpm/redhat/pyproject_buildrequires.py --generate-extras --python3_pkgversion 3 --wheeldir 
/builddir/build/BUILD/bidict-0.22.1/pyproject-wheeldir --output /builddir/build/BUILD/python-bidict-0.22.1-3.fc40.noarch-pyproject-buildrequires -x test Handling setuptools >= 40.9.0 from build-system.requires Requirement satisfied: setuptools >= 40.9.0 (installed: setuptools 68.2.2) No `name` configuration, performing automatic discovery No `packages` or `py_modules` configuration, performing automatic discovery. `flat-layout` detected -- analysing . discovered packages -- ['bidict'] No `name` configuration, performing automatic discovery Single module/package detected, name: bidict running egg_info creating bidict.egg-info writing bidict.egg-info/PKG-INFO writing dependency_links to bidict.egg-info/dependency_links.txt writing requirements to bidict.egg-info/requires.txt writing top-level names to bidict.egg-info/top_level.txt writing manifest file 'bidict.egg-info/SOURCES.txt' reading manifest file 'bidict.egg-info/SOURCES.txt' reading manifest template 'MANIFEST.in' warning: no previously-included files matching '*' found under directory 'docs' adding license file 'LICENSE' writing manifest file 'bidict.egg-info/SOURCES.txt' Handling wheel from get_requires_for_build_wheel Requirement not satisfied: wheel Exiting dependency generation pass: get_requires_for_build_wheel + cat /builddir/build/BUILD/python-bidict-0.22.1-3.fc40.noarch-pyproject-buildrequires + rm -rfv '*.dist-info/' + RPM_EC=0 ++ jobs -p + exit 0 Wrote: /builddir/build/SRPMS/python-bidict-0.22.1-3.fc40.buildreqs.nosrc.rpm Child return code was: 11 Dynamic buildrequires detected Going to install missing buildrequires. See root.log for details. ENTER ['do_with_status'](['bash', '--login', '-c', '/usr/bin/rpmbuild -br --noclean --target noarch --nodeps /builddir/build/SPECS/python-bidict.spec'], chrootPath='/var/lib/mock/f40-build-2426561-60598/root'env={'TERM': 'vt100', 'SHELL': '/bin/bash', 'HOME': '/builddir', 'HOSTNAME': 'mock', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'PROMPT_COMMAND': 'printf "\\033]0;\\007"', 'PS1': ' \\s-\\v\\$ ', 'LANG': 'C.UTF-8'}shell=Falselogger=timeout=864000uid=996gid=135user='mockbuild'nspawn_args=[]unshare_net=TrueraiseExc=FalseprintOutput=False) Executing command: ['bash', '--login', '-c', '/usr/bin/rpmbuild -br --noclean --target noarch --nodeps /builddir/build/SPECS/python-bidict.spec'] with env {'TERM': 'vt100', 'SHELL': '/bin/bash', 'HOME': '/builddir', 'HOSTNAME': 'mock', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'PROMPT_COMMAND': 'printf "\\033]0;\\007"', 'PS1': ' \\s-\\v\\$ ', 'LANG': 'C.UTF-8'} and shell False error: %changelog not in descending chronological order Building target platforms: noarch Building for target noarch setting SOURCE_DATE_EPOCH=1706227200 Executing(%prep): /bin/sh -e /var/tmp/rpm-tmp.CEZoW5 + umask 022 + cd /builddir/build/BUILD + cd /builddir/build/BUILD + rm -rf bidict-0.22.1 + /usr/lib/rpm/rpmuncompress -x /builddir/build/SOURCES/bidict-0.22.1.tar.gz + STATUS=0 + '[' 0 -ne 0 ']' + cd bidict-0.22.1 + rm -rf /builddir/build/BUILD/bidict-0.22.1-SPECPARTS + /usr/bin/mkdir -p /builddir/build/BUILD/bidict-0.22.1-SPECPARTS + /usr/bin/chmod -Rf a+rX,u+w,g-w,o-w . 
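[Annotation: each pass gets one step further. With setuptools now installed, the PEP 517 hook get_requires_for_build_wheel reports an additional dependency on wheel, so another round of BuildRequires installation follows. A rough sketch of calling that hook directly from the unpacked source tree; with setuptools 68.x, as in this chroot, it is expected to report wheel, while newer setuptools may return an empty list.]

    # Call the same PEP 517 hook that produced the
    # "Handling wheel from get_requires_for_build_wheel" line above.
    from setuptools import build_meta

    extra_reqs = build_meta.get_requires_for_build_wheel()
    print(extra_reqs)  # expected here: ['wheel']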
+ /usr/bin/patch -p1 -s --fuzz=0 --no-backup-if-mismatch -f + /usr/lib/rpm/rpmuncompress /builddir/build/SOURCES/0001-Test-with-python3.12-beta.patch + sed -r -i 's/"(pytest-(cov|benchmark))\b/# &/' pyproject.toml + sed -r -i /--benchmark-disable/d pytest.ini + rm -rf SPECPARTS + RPM_EC=0 ++ jobs -p + exit 0 Executing(%generate_buildrequires): /bin/sh -e /var/tmp/rpm-tmp.XcRuoz + umask 022 + cd /builddir/build/BUILD + cd bidict-0.22.1 + CFLAGS='-O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Werror=implicit-function-declaration -Werror=implicit-int -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fno-omit-frame-pointer ' + export CFLAGS + CXXFLAGS='-O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fno-omit-frame-pointer ' + export CXXFLAGS + FFLAGS='-O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fno-omit-frame-pointer -I/usr/lib/gfortran/modules ' + export FFLAGS + FCFLAGS='-O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fno-omit-frame-pointer -I/usr/lib/gfortran/modules ' + export FCFLAGS + VALAFLAGS=-g + export VALAFLAGS + RUSTFLAGS='-Copt-level=3 -Cdebuginfo=2 -Ccodegen-units=1 -Cstrip=none -Cforce-frame-pointers=yes -Clink-arg=-Wl,-z,relro -Clink-arg=-Wl,-z,now --cap-lints=warn' + export RUSTFLAGS + LDFLAGS='-Wl,-z,relro -Wl,--as-needed -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -Wl,--build-id=sha1 ' + export LDFLAGS + LT_SYS_LIBRARY_PATH=/usr/lib: + export LT_SYS_LIBRARY_PATH + CC=gcc + export CC + CXX=g++ + export CXX + echo pyproject-rpm-macros + echo python3-devel + echo 'python3dist(pip) >= 19' + echo 'python3dist(packaging)' + '[' -f pyproject.toml ']' + echo '(python3dist(tomli) if python3-devel < 3.11)' + rm -rfv '*.dist-info/' + '[' -f /usr/bin/python3 ']' + mkdir -p /builddir/build/BUILD/bidict-0.22.1/.pyproject-builddir + echo -n + CFLAGS='-O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Werror=implicit-function-declaration -Werror=implicit-int -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fno-omit-frame-pointer ' + LDFLAGS='-Wl,-z,relro -Wl,--as-needed -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -Wl,--build-id=sha1 ' + TMPDIR=/builddir/build/BUILD/bidict-0.22.1/.pyproject-builddir + RPM_TOXENV=py312 + HOSTNAME=rpmbuild + /usr/bin/python3 -Bs /usr/lib/rpm/redhat/pyproject_buildrequires.py --generate-extras --python3_pkgversion 3 --wheeldir 
/builddir/build/BUILD/bidict-0.22.1/pyproject-wheeldir --output /builddir/build/BUILD/python-bidict-0.22.1-3.fc40.noarch-pyproject-buildrequires -x test Handling setuptools >= 40.9.0 from build-system.requires Requirement satisfied: setuptools >= 40.9.0 (installed: setuptools 68.2.2) No `name` configuration, performing automatic discovery No `packages` or `py_modules` configuration, performing automatic discovery. `flat-layout` detected -- analysing . discovered packages -- ['bidict'] No `name` configuration, performing automatic discovery Single module/package detected, name: bidict running egg_info creating bidict.egg-info writing bidict.egg-info/PKG-INFO writing dependency_links to bidict.egg-info/dependency_links.txt writing requirements to bidict.egg-info/requires.txt writing top-level names to bidict.egg-info/top_level.txt writing manifest file 'bidict.egg-info/SOURCES.txt' reading manifest file 'bidict.egg-info/SOURCES.txt' reading manifest template 'MANIFEST.in' warning: no previously-included files matching '*' found under directory 'docs' adding license file 'LICENSE' writing manifest file 'bidict.egg-info/SOURCES.txt' Handling wheel from get_requires_for_build_wheel Requirement satisfied: wheel (installed: wheel 0.41.2) running dist_info writing bidict.egg-info/PKG-INFO writing dependency_links to bidict.egg-info/dependency_links.txt writing requirements to bidict.egg-info/requires.txt writing top-level names to bidict.egg-info/top_level.txt reading manifest file 'bidict.egg-info/SOURCES.txt' reading manifest template 'MANIFEST.in' warning: no previously-included files matching '*' found under directory 'docs' adding license file 'LICENSE' writing manifest file 'bidict.egg-info/SOURCES.txt' creating '/builddir/build/BUILD/bidict-0.22.1/bidict-0.22.1.dist-info' Handling sphinx ; extra == 'docs' from hook generated metadata: Requires-Dist (bidict) Ignoring alien requirement: sphinx ; extra == 'docs' Handling sphinx-copybutton ; extra == 'docs' from hook generated metadata: Requires-Dist (bidict) Ignoring alien requirement: sphinx-copybutton ; extra == 'docs' Handling furo ; extra == 'docs' from hook generated metadata: Requires-Dist (bidict) Ignoring alien requirement: furo ; extra == 'docs' Handling pre-commit ; extra == 'lint' from hook generated metadata: Requires-Dist (bidict) Ignoring alien requirement: pre-commit ; extra == 'lint' Handling hypothesis ; extra == 'test' from hook generated metadata: Requires-Dist (bidict) Requirement not satisfied: hypothesis ; extra == 'test' Handling pytest ; extra == 'test' from hook generated metadata: Requires-Dist (bidict) Requirement not satisfied: pytest ; extra == 'test' Handling pytest-xdist ; extra == 'test' from hook generated metadata: Requires-Dist (bidict) Requirement not satisfied: pytest-xdist ; extra == 'test' Handling sortedcollections ; extra == 'test' from hook generated metadata: Requires-Dist (bidict) Requirement not satisfied: sortedcollections ; extra == 'test' Handling sortedcontainers ; extra == 'test' from hook generated metadata: Requires-Dist (bidict) Requirement not satisfied: sortedcontainers ; extra == 'test' Handling sphinx ; extra == 'test' from hook generated metadata: Requires-Dist (bidict) Requirement not satisfied: sphinx ; extra == 'test' + cat /builddir/build/BUILD/python-bidict-0.22.1-3.fc40.noarch-pyproject-buildrequires + rm -rfv bidict-0.22.1.dist-info/ removed 'bidict-0.22.1.dist-info/METADATA' removed 'bidict-0.22.1.dist-info/top_level.txt' removed 'bidict-0.22.1.dist-info/LICENSE' removed 
directory 'bidict-0.22.1.dist-info/' + RPM_EC=0 ++ jobs -p + exit 0 Wrote: /builddir/build/SRPMS/python-bidict-0.22.1-3.fc40.buildreqs.nosrc.rpm Child return code was: 11 Dynamic buildrequires detected Going to install missing buildrequires. See root.log for details. ENTER ['do_with_status'](['bash', '--login', '-c', '/usr/bin/rpmbuild -br --noclean --target noarch --nodeps /builddir/build/SPECS/python-bidict.spec'], chrootPath='/var/lib/mock/f40-build-2426561-60598/root'env={'TERM': 'vt100', 'SHELL': '/bin/bash', 'HOME': '/builddir', 'HOSTNAME': 'mock', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'PROMPT_COMMAND': 'printf "\\033]0;\\007"', 'PS1': ' \\s-\\v\\$ ', 'LANG': 'C.UTF-8'}shell=Falselogger=timeout=864000uid=996gid=135user='mockbuild'nspawn_args=[]unshare_net=TrueraiseExc=FalseprintOutput=False) Executing command: ['bash', '--login', '-c', '/usr/bin/rpmbuild -br --noclean --target noarch --nodeps /builddir/build/SPECS/python-bidict.spec'] with env {'TERM': 'vt100', 'SHELL': '/bin/bash', 'HOME': '/builddir', 'HOSTNAME': 'mock', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'PROMPT_COMMAND': 'printf "\\033]0;\\007"', 'PS1': ' \\s-\\v\\$ ', 'LANG': 'C.UTF-8'} and shell False error: %changelog not in descending chronological order Building target platforms: noarch Building for target noarch setting SOURCE_DATE_EPOCH=1706227200 Executing(%prep): /bin/sh -e /var/tmp/rpm-tmp.hsVkWb + umask 022 + cd /builddir/build/BUILD + cd /builddir/build/BUILD + rm -rf bidict-0.22.1 + /usr/lib/rpm/rpmuncompress -x /builddir/build/SOURCES/bidict-0.22.1.tar.gz + STATUS=0 + '[' 0 -ne 0 ']' + cd bidict-0.22.1 + rm -rf /builddir/build/BUILD/bidict-0.22.1-SPECPARTS + /usr/bin/mkdir -p /builddir/build/BUILD/bidict-0.22.1-SPECPARTS + /usr/bin/chmod -Rf a+rX,u+w,g-w,o-w . + /usr/lib/rpm/rpmuncompress /builddir/build/SOURCES/0001-Test-with-python3.12-beta.patch + /usr/bin/patch -p1 -s --fuzz=0 --no-backup-if-mismatch -f + sed -r -i 's/"(pytest-(cov|benchmark))\b/# &/' pyproject.toml + sed -r -i /--benchmark-disable/d pytest.ini + rm -rf SPECPARTS + RPM_EC=0 ++ jobs -p + exit 0 Executing(%generate_buildrequires): /bin/sh -e /var/tmp/rpm-tmp.YCguWo + umask 022 + cd /builddir/build/BUILD + cd bidict-0.22.1 + CFLAGS='-O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Werror=implicit-function-declaration -Werror=implicit-int -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fno-omit-frame-pointer ' + export CFLAGS + CXXFLAGS='-O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fno-omit-frame-pointer ' + export CXXFLAGS + FFLAGS='-O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fno-omit-frame-pointer -I/usr/lib/gfortran/modules ' + export FFLAGS + FCFLAGS='-O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong 
-specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fno-omit-frame-pointer -I/usr/lib/gfortran/modules ' + export FCFLAGS + VALAFLAGS=-g + export VALAFLAGS + RUSTFLAGS='-Copt-level=3 -Cdebuginfo=2 -Ccodegen-units=1 -Cstrip=none -Cforce-frame-pointers=yes -Clink-arg=-Wl,-z,relro -Clink-arg=-Wl,-z,now --cap-lints=warn' + export RUSTFLAGS + LDFLAGS='-Wl,-z,relro -Wl,--as-needed -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -Wl,--build-id=sha1 ' + export LDFLAGS + LT_SYS_LIBRARY_PATH=/usr/lib: + export LT_SYS_LIBRARY_PATH + CC=gcc + export CC + CXX=g++ + export CXX + echo pyproject-rpm-macros + echo python3-devel + echo 'python3dist(pip) >= 19' + echo 'python3dist(packaging)' + '[' -f pyproject.toml ']' + echo '(python3dist(tomli) if python3-devel < 3.11)' + rm -rfv '*.dist-info/' + '[' -f /usr/bin/python3 ']' + mkdir -p /builddir/build/BUILD/bidict-0.22.1/.pyproject-builddir + echo -n + CFLAGS='-O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Werror=implicit-function-declaration -Werror=implicit-int -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fno-omit-frame-pointer ' + LDFLAGS='-Wl,-z,relro -Wl,--as-needed -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -Wl,--build-id=sha1 ' + TMPDIR=/builddir/build/BUILD/bidict-0.22.1/.pyproject-builddir + RPM_TOXENV=py312 + HOSTNAME=rpmbuild + /usr/bin/python3 -Bs /usr/lib/rpm/redhat/pyproject_buildrequires.py --generate-extras --python3_pkgversion 3 --wheeldir /builddir/build/BUILD/bidict-0.22.1/pyproject-wheeldir --output /builddir/build/BUILD/python-bidict-0.22.1-3.fc40.noarch-pyproject-buildrequires -x test Handling setuptools >= 40.9.0 from build-system.requires Requirement satisfied: setuptools >= 40.9.0 (installed: setuptools 68.2.2) No `name` configuration, performing automatic discovery No `packages` or `py_modules` configuration, performing automatic discovery. `flat-layout` detected -- analysing . 
discovered packages -- ['bidict'] No `name` configuration, performing automatic discovery Single module/package detected, name: bidict running egg_info creating bidict.egg-info writing bidict.egg-info/PKG-INFO writing dependency_links to bidict.egg-info/dependency_links.txt writing requirements to bidict.egg-info/requires.txt writing top-level names to bidict.egg-info/top_level.txt writing manifest file 'bidict.egg-info/SOURCES.txt' reading manifest file 'bidict.egg-info/SOURCES.txt' reading manifest template 'MANIFEST.in' warning: no previously-included files matching '*' found under directory 'docs' adding license file 'LICENSE' writing manifest file 'bidict.egg-info/SOURCES.txt' Handling wheel from get_requires_for_build_wheel Requirement satisfied: wheel (installed: wheel 0.41.2) running dist_info writing bidict.egg-info/PKG-INFO writing dependency_links to bidict.egg-info/dependency_links.txt writing requirements to bidict.egg-info/requires.txt writing top-level names to bidict.egg-info/top_level.txt reading manifest file 'bidict.egg-info/SOURCES.txt' reading manifest template 'MANIFEST.in' warning: no previously-included files matching '*' found under directory 'docs' adding license file 'LICENSE' writing manifest file 'bidict.egg-info/SOURCES.txt' creating '/builddir/build/BUILD/bidict-0.22.1/bidict-0.22.1.dist-info' Handling sphinx ; extra == 'docs' from hook generated metadata: Requires-Dist (bidict) Ignoring alien requirement: sphinx ; extra == 'docs' Handling sphinx-copybutton ; extra == 'docs' from hook generated metadata: Requires-Dist (bidict) Ignoring alien requirement: sphinx-copybutton ; extra == 'docs' Handling furo ; extra == 'docs' from hook generated metadata: Requires-Dist (bidict) Ignoring alien requirement: furo ; extra == 'docs' Handling pre-commit ; extra == 'lint' from hook generated metadata: Requires-Dist (bidict) Ignoring alien requirement: pre-commit ; extra == 'lint' Handling hypothesis ; extra == 'test' from hook generated metadata: Requires-Dist (bidict) Requirement satisfied: hypothesis ; extra == 'test' (installed: hypothesis 6.82.0) Handling pytest ; extra == 'test' from hook generated metadata: Requires-Dist (bidict) Requirement satisfied: pytest ; extra == 'test' (installed: pytest 7.3.2) Handling pytest-xdist ; extra == 'test' from hook generated metadata: Requires-Dist (bidict) Requirement satisfied: pytest-xdist ; extra == 'test' (installed: pytest-xdist 3.5.0) Handling sortedcollections ; extra == 'test' from hook generated metadata: Requires-Dist (bidict) Requirement satisfied: sortedcollections ; extra == 'test' (installed: sortedcollections 2.1.0) Handling sortedcontainers ; extra == 'test' from hook generated metadata: Requires-Dist (bidict) Requirement satisfied: sortedcontainers ; extra == 'test' (installed: sortedcontainers 2.4.0) Handling sphinx ; extra == 'test' from hook generated metadata: Requires-Dist (bidict) Requirement satisfied: sphinx ; extra == 'test' (installed: sphinx 7.2.6) + cat /builddir/build/BUILD/python-bidict-0.22.1-3.fc40.noarch-pyproject-buildrequires + rm -rfv bidict-0.22.1.dist-info/ removed 'bidict-0.22.1.dist-info/METADATA' removed 'bidict-0.22.1.dist-info/top_level.txt' removed 'bidict-0.22.1.dist-info/LICENSE' removed directory 'bidict-0.22.1.dist-info/' + RPM_EC=0 ++ jobs -p + exit 0 Wrote: /builddir/build/SRPMS/python-bidict-0.22.1-3.fc40.buildreqs.nosrc.rpm Child return code was: 11 Dynamic buildrequires detected Going to install missing buildrequires. See root.log for details. 
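[Annotation: the generator is invoked with "-x test" (see the pyproject_buildrequires.py command above), so only requirements whose environment marker matches extra == 'test' become BuildRequires; the 'docs' and 'lint' extras are logged as "alien" and ignored. A simplified sketch of that filtering, again with packaging; not the macro's actual implementation.]

    # Evaluate each requirement's marker against the requested extra, mirroring
    # "Handling ... extra == 'test'" vs. "Ignoring alien requirement" above.
    from packaging.requirements import Requirement

    requested_extras = {"test"}  # from the -x test option shown above
    for line in ("hypothesis ; extra == 'test'", "sphinx ; extra == 'docs'"):
        req = Requirement(line)
        wanted = req.marker is None or any(
            req.marker.evaluate({"extra": extra}) for extra in requested_extras
        )
        print(req.name, "->", "keep" if wanted else "ignore")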
ENTER ['do_with_status'](['bash', '--login', '-c', '/usr/bin/rpmbuild -ba --noprep --noclean --target noarch --nodeps /builddir/build/SPECS/python-bidict.spec'], chrootPath='/var/lib/mock/f40-build-2426561-60598/root'env={'TERM': 'vt100', 'SHELL': '/bin/bash', 'HOME': '/builddir', 'HOSTNAME': 'mock', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'PROMPT_COMMAND': 'printf "\\033]0;\\007"', 'PS1': ' \\s-\\v\\$ ', 'LANG': 'C.UTF-8'}shell=Falselogger=timeout=864000uid=996gid=135user='mockbuild'nspawn_args=[]unshare_net=TrueprintOutput=False) Executing command: ['bash', '--login', '-c', '/usr/bin/rpmbuild -ba --noprep --noclean --target noarch --nodeps /builddir/build/SPECS/python-bidict.spec'] with env {'TERM': 'vt100', 'SHELL': '/bin/bash', 'HOME': '/builddir', 'HOSTNAME': 'mock', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'PROMPT_COMMAND': 'printf "\\033]0;\\007"', 'PS1': ' \\s-\\v\\$ ', 'LANG': 'C.UTF-8'} and shell False error: %changelog not in descending chronological order Building target platforms: noarch Building for target noarch setting SOURCE_DATE_EPOCH=1706227200 Executing(%generate_buildrequires): /bin/sh -e /var/tmp/rpm-tmp.tMGlos + umask 022 + cd /builddir/build/BUILD + cd bidict-0.22.1 + CFLAGS='-O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Werror=implicit-function-declaration -Werror=implicit-int -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fno-omit-frame-pointer ' + export CFLAGS + CXXFLAGS='-O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fno-omit-frame-pointer ' + export CXXFLAGS + FFLAGS='-O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fno-omit-frame-pointer -I/usr/lib/gfortran/modules ' + export FFLAGS + FCFLAGS='-O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fno-omit-frame-pointer -I/usr/lib/gfortran/modules ' + export FCFLAGS + VALAFLAGS=-g + export VALAFLAGS + RUSTFLAGS='-Copt-level=3 -Cdebuginfo=2 -Ccodegen-units=1 -Cstrip=none -Cforce-frame-pointers=yes -Clink-arg=-Wl,-z,relro -Clink-arg=-Wl,-z,now --cap-lints=warn' + export RUSTFLAGS + LDFLAGS='-Wl,-z,relro -Wl,--as-needed -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -Wl,--build-id=sha1 ' + export LDFLAGS + LT_SYS_LIBRARY_PATH=/usr/lib: + export LT_SYS_LIBRARY_PATH + CC=gcc + export CC + CXX=g++ + export CXX + echo pyproject-rpm-macros + echo python3-devel + echo 'python3dist(pip) >= 19' + echo 'python3dist(packaging)' + '[' -f pyproject.toml ']' + echo '(python3dist(tomli) if python3-devel < 3.11)' + rm -rfv '*.dist-info/' + '[' -f /usr/bin/python3 ']' + mkdir -p /builddir/build/BUILD/bidict-0.22.1/.pyproject-builddir + echo -n + CFLAGS='-O2 -fexceptions -g -grecord-gcc-switches -pipe 
-Wall -Werror=format-security -Werror=implicit-function-declaration -Werror=implicit-int -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fno-omit-frame-pointer ' + LDFLAGS='-Wl,-z,relro -Wl,--as-needed -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -Wl,--build-id=sha1 ' + TMPDIR=/builddir/build/BUILD/bidict-0.22.1/.pyproject-builddir + RPM_TOXENV=py312 + HOSTNAME=rpmbuild + /usr/bin/python3 -Bs /usr/lib/rpm/redhat/pyproject_buildrequires.py --generate-extras --python3_pkgversion 3 --wheeldir /builddir/build/BUILD/bidict-0.22.1/pyproject-wheeldir --output /builddir/build/BUILD/python-bidict-0.22.1-3.fc40.noarch-pyproject-buildrequires -x test Handling setuptools >= 40.9.0 from build-system.requires Requirement satisfied: setuptools >= 40.9.0 (installed: setuptools 68.2.2) No `name` configuration, performing automatic discovery No `packages` or `py_modules` configuration, performing automatic discovery. `flat-layout` detected -- analysing . discovered packages -- ['bidict'] No `name` configuration, performing automatic discovery Single module/package detected, name: bidict running egg_info writing bidict.egg-info/PKG-INFO writing dependency_links to bidict.egg-info/dependency_links.txt writing requirements to bidict.egg-info/requires.txt writing top-level names to bidict.egg-info/top_level.txt reading manifest file 'bidict.egg-info/SOURCES.txt' reading manifest template 'MANIFEST.in' warning: no previously-included files matching '*' found under directory 'docs' adding license file 'LICENSE' writing manifest file 'bidict.egg-info/SOURCES.txt' Handling wheel from get_requires_for_build_wheel Requirement satisfied: wheel (installed: wheel 0.41.2) running dist_info writing bidict.egg-info/PKG-INFO writing dependency_links to bidict.egg-info/dependency_links.txt writing requirements to bidict.egg-info/requires.txt writing top-level names to bidict.egg-info/top_level.txt reading manifest file 'bidict.egg-info/SOURCES.txt' reading manifest template 'MANIFEST.in' warning: no previously-included files matching '*' found under directory 'docs' adding license file 'LICENSE' writing manifest file 'bidict.egg-info/SOURCES.txt' creating '/builddir/build/BUILD/bidict-0.22.1/bidict-0.22.1.dist-info' Handling sphinx ; extra == 'docs' from hook generated metadata: Requires-Dist (bidict) Ignoring alien requirement: sphinx ; extra == 'docs' Handling sphinx-copybutton ; extra == 'docs' from hook generated metadata: Requires-Dist (bidict) Ignoring alien requirement: sphinx-copybutton ; extra == 'docs' Handling furo ; extra == 'docs' from hook generated metadata: Requires-Dist (bidict) Ignoring alien requirement: furo ; extra == 'docs' Handling pre-commit ; extra == 'lint' from hook generated metadata: Requires-Dist (bidict) Ignoring alien requirement: pre-commit ; extra == 'lint' Handling hypothesis ; extra == 'test' from hook generated metadata: Requires-Dist (bidict) Requirement satisfied: hypothesis ; extra == 'test' (installed: hypothesis 6.82.0) Handling pytest ; extra == 'test' from hook generated metadata: Requires-Dist (bidict) Requirement satisfied: pytest ; extra == 'test' (installed: pytest 7.3.2) Handling pytest-xdist ; extra == 'test' from hook generated metadata: Requires-Dist (bidict) Requirement satisfied: pytest-xdist ; extra == 'test' (installed: pytest-xdist 3.5.0) Handling 
sortedcollections ; extra == 'test' from hook generated metadata: Requires-Dist (bidict) Requirement satisfied: sortedcollections ; extra == 'test' (installed: sortedcollections 2.1.0) Handling sortedcontainers ; extra == 'test' from hook generated metadata: Requires-Dist (bidict) Requirement satisfied: sortedcontainers ; extra == 'test' (installed: sortedcontainers 2.4.0) Handling sphinx ; extra == 'test' from hook generated metadata: Requires-Dist (bidict) Requirement satisfied: sphinx ; extra == 'test' (installed: sphinx 7.2.6) + cat /builddir/build/BUILD/python-bidict-0.22.1-3.fc40.noarch-pyproject-buildrequires + rm -rfv bidict-0.22.1.dist-info/ removed 'bidict-0.22.1.dist-info/METADATA' removed 'bidict-0.22.1.dist-info/top_level.txt' removed 'bidict-0.22.1.dist-info/LICENSE' removed directory 'bidict-0.22.1.dist-info/' + RPM_EC=0 ++ jobs -p + exit 0 Executing(%build): /bin/sh -e /var/tmp/rpm-tmp.GgEg1A + umask 022 + cd /builddir/build/BUILD + CFLAGS='-O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Werror=implicit-function-declaration -Werror=implicit-int -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fno-omit-frame-pointer ' + export CFLAGS + CXXFLAGS='-O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fno-omit-frame-pointer ' + export CXXFLAGS + FFLAGS='-O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fno-omit-frame-pointer -I/usr/lib/gfortran/modules ' + export FFLAGS + FCFLAGS='-O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fno-omit-frame-pointer -I/usr/lib/gfortran/modules ' + export FCFLAGS + VALAFLAGS=-g + export VALAFLAGS + RUSTFLAGS='-Copt-level=3 -Cdebuginfo=2 -Ccodegen-units=1 -Cstrip=none -Cforce-frame-pointers=yes -Clink-arg=-Wl,-z,relro -Clink-arg=-Wl,-z,now --cap-lints=warn' + export RUSTFLAGS + LDFLAGS='-Wl,-z,relro -Wl,--as-needed -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -Wl,--build-id=sha1 ' + export LDFLAGS + LT_SYS_LIBRARY_PATH=/usr/lib: + export LT_SYS_LIBRARY_PATH + CC=gcc + export CC + CXX=g++ + export CXX + cd bidict-0.22.1 + mkdir -p /builddir/build/BUILD/bidict-0.22.1/.pyproject-builddir + CFLAGS='-O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Werror=implicit-function-declaration -Werror=implicit-int -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fno-omit-frame-pointer ' + LDFLAGS='-Wl,-z,relro -Wl,--as-needed -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -Wl,--build-id=sha1 
' + TMPDIR=/builddir/build/BUILD/bidict-0.22.1/.pyproject-builddir + /usr/bin/python3 -Bs /usr/lib/rpm/redhat/pyproject_wheel.py /builddir/build/BUILD/bidict-0.22.1/pyproject-wheeldir Processing /builddir/build/BUILD/bidict-0.22.1 Preparing metadata (pyproject.toml): started Running command Preparing metadata (pyproject.toml) No `packages` or `py_modules` configuration, performing automatic discovery. `flat-layout` detected -- analysing . discovered packages -- ['bidict'] No `name` configuration, performing automatic discovery Single module/package detected, name: bidict running dist_info creating /builddir/build/BUILD/bidict-0.22.1/.pyproject-builddir/pip-modern-metadata-v0hqqc0e/bidict.egg-info writing /builddir/build/BUILD/bidict-0.22.1/.pyproject-builddir/pip-modern-metadata-v0hqqc0e/bidict.egg-info/PKG-INFO writing dependency_links to /builddir/build/BUILD/bidict-0.22.1/.pyproject-builddir/pip-modern-metadata-v0hqqc0e/bidict.egg-info/dependency_links.txt writing requirements to /builddir/build/BUILD/bidict-0.22.1/.pyproject-builddir/pip-modern-metadata-v0hqqc0e/bidict.egg-info/requires.txt writing top-level names to /builddir/build/BUILD/bidict-0.22.1/.pyproject-builddir/pip-modern-metadata-v0hqqc0e/bidict.egg-info/top_level.txt writing manifest file '/builddir/build/BUILD/bidict-0.22.1/.pyproject-builddir/pip-modern-metadata-v0hqqc0e/bidict.egg-info/SOURCES.txt' reading manifest file '/builddir/build/BUILD/bidict-0.22.1/.pyproject-builddir/pip-modern-metadata-v0hqqc0e/bidict.egg-info/SOURCES.txt' reading manifest template 'MANIFEST.in' warning: no previously-included files matching '*' found under directory 'docs' adding license file 'LICENSE' writing manifest file '/builddir/build/BUILD/bidict-0.22.1/.pyproject-builddir/pip-modern-metadata-v0hqqc0e/bidict.egg-info/SOURCES.txt' creating '/builddir/build/BUILD/bidict-0.22.1/.pyproject-builddir/pip-modern-metadata-v0hqqc0e/bidict-0.22.1.dist-info' Preparing metadata (pyproject.toml): finished with status 'done' Building wheels for collected packages: bidict Building wheel for bidict (pyproject.toml): started Running command Building wheel for bidict (pyproject.toml) No `packages` or `py_modules` configuration, performing automatic discovery. `flat-layout` detected -- analysing . 
discovered packages -- ['bidict'] No `name` configuration, performing automatic discovery Single module/package detected, name: bidict running bdist_wheel running build running build_py creating build creating build/lib creating build/lib/bidict copying bidict/_exc.py -> build/lib/bidict copying bidict/__init__.py -> build/lib/bidict copying bidict/_frozenbidict.py -> build/lib/bidict copying bidict/_abc.py -> build/lib/bidict copying bidict/_base.py -> build/lib/bidict copying bidict/_named.py -> build/lib/bidict copying bidict/_frozenordered.py -> build/lib/bidict copying bidict/_orderedbase.py -> build/lib/bidict copying bidict/_bidict.py -> build/lib/bidict copying bidict/_orderedbidict.py -> build/lib/bidict copying bidict/_typing.py -> build/lib/bidict copying bidict/metadata.py -> build/lib/bidict copying bidict/_dup.py -> build/lib/bidict copying bidict/_iter.py -> build/lib/bidict running egg_info writing bidict.egg-info/PKG-INFO writing dependency_links to bidict.egg-info/dependency_links.txt writing requirements to bidict.egg-info/requires.txt writing top-level names to bidict.egg-info/top_level.txt reading manifest file 'bidict.egg-info/SOURCES.txt' reading manifest template 'MANIFEST.in' warning: no previously-included files matching '*' found under directory 'docs' adding license file 'LICENSE' writing manifest file 'bidict.egg-info/SOURCES.txt' copying bidict/py.typed -> build/lib/bidict installing to build/bdist.linux-riscv64/wheel running install running install_lib creating build/bdist.linux-riscv64 creating build/bdist.linux-riscv64/wheel creating build/bdist.linux-riscv64/wheel/bidict copying build/lib/bidict/_exc.py -> build/bdist.linux-riscv64/wheel/bidict copying build/lib/bidict/__init__.py -> build/bdist.linux-riscv64/wheel/bidict copying build/lib/bidict/_frozenbidict.py -> build/bdist.linux-riscv64/wheel/bidict copying build/lib/bidict/_abc.py -> build/bdist.linux-riscv64/wheel/bidict copying build/lib/bidict/_base.py -> build/bdist.linux-riscv64/wheel/bidict copying build/lib/bidict/_named.py -> build/bdist.linux-riscv64/wheel/bidict copying build/lib/bidict/_frozenordered.py -> build/bdist.linux-riscv64/wheel/bidict copying build/lib/bidict/py.typed -> build/bdist.linux-riscv64/wheel/bidict copying build/lib/bidict/_orderedbase.py -> build/bdist.linux-riscv64/wheel/bidict copying build/lib/bidict/_bidict.py -> build/bdist.linux-riscv64/wheel/bidict copying build/lib/bidict/_orderedbidict.py -> build/bdist.linux-riscv64/wheel/bidict copying build/lib/bidict/_typing.py -> build/bdist.linux-riscv64/wheel/bidict copying build/lib/bidict/metadata.py -> build/bdist.linux-riscv64/wheel/bidict copying build/lib/bidict/_dup.py -> build/bdist.linux-riscv64/wheel/bidict copying build/lib/bidict/_iter.py -> build/bdist.linux-riscv64/wheel/bidict running install_egg_info Copying bidict.egg-info to build/bdist.linux-riscv64/wheel/bidict-0.22.1-py3.12.egg-info running install_scripts creating build/bdist.linux-riscv64/wheel/bidict-0.22.1.dist-info/WHEEL creating '/builddir/build/BUILD/bidict-0.22.1/.pyproject-builddir/pip-wheel-8o07s_af/.tmp-1kx58_tp/bidict-0.22.1-py3-none-any.whl' and adding 'build/bdist.linux-riscv64/wheel' to it adding 'bidict/__init__.py' adding 'bidict/_abc.py' adding 'bidict/_base.py' adding 'bidict/_bidict.py' adding 'bidict/_dup.py' adding 'bidict/_exc.py' adding 'bidict/_frozenbidict.py' adding 'bidict/_frozenordered.py' adding 'bidict/_iter.py' adding 'bidict/_named.py' adding 'bidict/_orderedbase.py' adding 'bidict/_orderedbidict.py' adding 
'bidict/_typing.py' adding 'bidict/metadata.py' adding 'bidict/py.typed' adding 'bidict-0.22.1.dist-info/LICENSE' adding 'bidict-0.22.1.dist-info/METADATA' adding 'bidict-0.22.1.dist-info/WHEEL' adding 'bidict-0.22.1.dist-info/top_level.txt' adding 'bidict-0.22.1.dist-info/RECORD' removing build/bdist.linux-riscv64/wheel Building wheel for bidict (pyproject.toml): finished with status 'done' Created wheel for bidict: filename=bidict-0.22.1-py3-none-any.whl size=35865 sha256=8cd2585b3820d0a0b35bcd3e55f9db36fac4cae7b9dd4775572ad5ee6a650894 Stored in directory: /builddir/.cache/pip/wheels/3a/53/16/fe2b91587b0bdf22b71d98562c19c85c6937e8286af1272b47 Successfully built bidict + RPM_EC=0 ++ jobs -p + exit 0 Executing(%install): /bin/sh -e /var/tmp/rpm-tmp.ujslZV + umask 022 + cd /builddir/build/BUILD + '[' /builddir/build/BUILDROOT/python-bidict-0.22.1-3.fc40.noarch '!=' / ']' + rm -rf /builddir/build/BUILDROOT/python-bidict-0.22.1-3.fc40.noarch ++ dirname /builddir/build/BUILDROOT/python-bidict-0.22.1-3.fc40.noarch + mkdir -p /builddir/build/BUILDROOT + mkdir /builddir/build/BUILDROOT/python-bidict-0.22.1-3.fc40.noarch + CFLAGS='-O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Werror=implicit-function-declaration -Werror=implicit-int -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fno-omit-frame-pointer ' + export CFLAGS + CXXFLAGS='-O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fno-omit-frame-pointer ' + export CXXFLAGS + FFLAGS='-O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fno-omit-frame-pointer -I/usr/lib/gfortran/modules ' + export FFLAGS + FCFLAGS='-O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fno-omit-frame-pointer -I/usr/lib/gfortran/modules ' + export FCFLAGS + VALAFLAGS=-g + export VALAFLAGS + RUSTFLAGS='-Copt-level=3 -Cdebuginfo=2 -Ccodegen-units=1 -Cstrip=none -Cforce-frame-pointers=yes -Clink-arg=-Wl,-z,relro -Clink-arg=-Wl,-z,now --cap-lints=warn' + export RUSTFLAGS + LDFLAGS='-Wl,-z,relro -Wl,--as-needed -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -Wl,--build-id=sha1 ' + export LDFLAGS + LT_SYS_LIBRARY_PATH=/usr/lib: + export LT_SYS_LIBRARY_PATH + CC=gcc + export CC + CXX=g++ + export CXX + cd bidict-0.22.1 ++ ls /builddir/build/BUILD/bidict-0.22.1/pyproject-wheeldir/bidict-0.22.1-py3-none-any.whl ++ xargs basename --multiple ++ sed -E 's/([^-]+)-([^-]+)-.+\.whl/\1==\2/' + specifier=bidict==0.22.1 + '[' -z bidict==0.22.1 ']' + TMPDIR=/builddir/build/BUILD/bidict-0.22.1/.pyproject-builddir + /usr/bin/python3 -m pip install --root /builddir/build/BUILDROOT/python-bidict-0.22.1-3.fc40.noarch --prefix /usr --no-deps --disable-pip-version-check 
--progress-bar off --verbose --ignore-installed --no-warn-script-location --no-index --no-cache-dir --find-links /builddir/build/BUILD/bidict-0.22.1/pyproject-wheeldir bidict==0.22.1 Using pip 23.3.2 from /usr/lib/python3.12/site-packages/pip (python 3.12) Looking in links: /builddir/build/BUILD/bidict-0.22.1/pyproject-wheeldir Processing ./pyproject-wheeldir/bidict-0.22.1-py3-none-any.whl Installing collected packages: bidict Successfully installed bidict-0.22.1 + '[' -d /builddir/build/BUILDROOT/python-bidict-0.22.1-3.fc40.noarch/usr/bin ']' + rm -f /builddir/build/BUILD/python-bidict-0.22.1-3.fc40.noarch-pyproject-ghost-distinfo + site_dirs=() + '[' -d /builddir/build/BUILDROOT/python-bidict-0.22.1-3.fc40.noarch/usr/lib/python3.12/site-packages ']' + site_dirs+=("/usr/lib/python3.12/site-packages") + '[' /builddir/build/BUILDROOT/python-bidict-0.22.1-3.fc40.noarch/usr/lib64/python3.12/site-packages '!=' /builddir/build/BUILDROOT/python-bidict-0.22.1-3.fc40.noarch/usr/lib/python3.12/site-packages ']' + '[' -d /builddir/build/BUILDROOT/python-bidict-0.22.1-3.fc40.noarch/usr/lib64/python3.12/site-packages ']' + for site_dir in ${site_dirs[@]} + for distinfo in /builddir/build/BUILDROOT/python-bidict-0.22.1-3.fc40.noarch$site_dir/*.dist-info + echo '%ghost /usr/lib/python3.12/site-packages/bidict-0.22.1.dist-info' + sed -i s/pip/rpm/ /builddir/build/BUILDROOT/python-bidict-0.22.1-3.fc40.noarch/usr/lib/python3.12/site-packages/bidict-0.22.1.dist-info/INSTALLER + PYTHONPATH=/usr/lib/rpm/redhat + /usr/bin/python3 -B /usr/lib/rpm/redhat/pyproject_preprocess_record.py --buildroot /builddir/build/BUILDROOT/python-bidict-0.22.1-3.fc40.noarch --record /builddir/build/BUILDROOT/python-bidict-0.22.1-3.fc40.noarch/usr/lib/python3.12/site-packages/bidict-0.22.1.dist-info/RECORD --output /builddir/build/BUILD/python-bidict-0.22.1-3.fc40.noarch-pyproject-record + rm -fv /builddir/build/BUILDROOT/python-bidict-0.22.1-3.fc40.noarch/usr/lib/python3.12/site-packages/bidict-0.22.1.dist-info/RECORD removed '/builddir/build/BUILDROOT/python-bidict-0.22.1-3.fc40.noarch/usr/lib/python3.12/site-packages/bidict-0.22.1.dist-info/RECORD' + rm -fv /builddir/build/BUILDROOT/python-bidict-0.22.1-3.fc40.noarch/usr/lib/python3.12/site-packages/bidict-0.22.1.dist-info/REQUESTED removed '/builddir/build/BUILDROOT/python-bidict-0.22.1-3.fc40.noarch/usr/lib/python3.12/site-packages/bidict-0.22.1.dist-info/REQUESTED' ++ wc -l /builddir/build/BUILD/python-bidict-0.22.1-3.fc40.noarch-pyproject-ghost-distinfo ++ cut -f1 '-d ' + lines=1 + '[' 1 -ne 1 ']' + RPM_PERCENTAGES_COUNT=2 + /usr/bin/python3 /usr/lib/rpm/redhat/pyproject_save_files.py --output-files /builddir/build/BUILD/python-bidict-0.22.1-3.fc40.noarch-pyproject-files --output-modules /builddir/build/BUILD/python-bidict-0.22.1-3.fc40.noarch-pyproject-modules --buildroot /builddir/build/BUILDROOT/python-bidict-0.22.1-3.fc40.noarch --sitelib /usr/lib/python3.12/site-packages --sitearch /usr/lib64/python3.12/site-packages --python-version 3.12 --pyproject-record /builddir/build/BUILD/python-bidict-0.22.1-3.fc40.noarch-pyproject-record --prefix /usr bidict + /usr/bin/find-debuginfo -j8 --strict-build-id -m -i --build-id-seed 0.22.1-3.fc40 --unique-debug-suffix -0.22.1-3.fc40.noarch --unique-debug-src-base python-bidict-0.22.1-3.fc40.noarch --run-dwz --dwz-low-mem-die-limit 10000000 --dwz-max-die-limit 50000000 -S debugsourcefiles.list /builddir/build/BUILD/bidict-0.22.1 find-debuginfo: starting Extracting debug info from 0 files Creating .debug symlinks for symlinks to ELF 
files find: ‘debug’: No such file or directory find-debuginfo: done + /usr/lib/rpm/check-buildroot + /usr/lib/rpm/redhat/brp-ldconfig + /usr/lib/rpm/brp-compress + /usr/lib/rpm/redhat/brp-strip-lto /usr/bin/strip + /usr/lib/rpm/brp-strip-static-archive /usr/bin/strip + /usr/lib/rpm/check-rpaths + /usr/lib/rpm/redhat/brp-mangle-shebangs + /usr/lib/rpm/brp-remove-la-files + env /usr/lib/rpm/redhat/brp-python-bytecompile '' 1 0 -j8 Bytecompiling .py files below /builddir/build/BUILDROOT/python-bidict-0.22.1-3.fc40.noarch/usr/lib/python3.12 using python3.12 + /usr/lib/rpm/redhat/brp-python-hardlink Executing(%check): /bin/sh -e /var/tmp/rpm-tmp.pRfbYT + umask 022 + cd /builddir/build/BUILD + CFLAGS='-O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Werror=implicit-function-declaration -Werror=implicit-int -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fno-omit-frame-pointer ' + export CFLAGS + CXXFLAGS='-O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fno-omit-frame-pointer ' + export CXXFLAGS + FFLAGS='-O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fno-omit-frame-pointer -I/usr/lib/gfortran/modules ' + export FFLAGS + FCFLAGS='-O2 -fexceptions -g -grecord-gcc-switches -pipe -Wall -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fno-omit-frame-pointer -I/usr/lib/gfortran/modules ' + export FCFLAGS + VALAFLAGS=-g + export VALAFLAGS + RUSTFLAGS='-Copt-level=3 -Cdebuginfo=2 -Ccodegen-units=1 -Cstrip=none -Cforce-frame-pointers=yes -Clink-arg=-Wl,-z,relro -Clink-arg=-Wl,-z,now --cap-lints=warn' + export RUSTFLAGS + LDFLAGS='-Wl,-z,relro -Wl,--as-needed -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -Wl,--build-id=sha1 ' + export LDFLAGS + LT_SYS_LIBRARY_PATH=/usr/lib: + export LT_SYS_LIBRARY_PATH + CC=gcc + export CC + CXX=g++ + export CXX + cd bidict-0.22.1 + rm -vf tests/test_microbenchmarks.py removed 'tests/test_microbenchmarks.py' + PYTHONPATH=/builddir/build/BUILDROOT/python-bidict-0.22.1-3.fc40.noarch/usr/lib/python3.12/site-packages + PYTHONDONTWRITEBYTECODE=1 + PYTEST_XDIST_AUTO_NUM_WORKERS=8 + /usr/bin/python3 ./run_tests.py ============================= test session starts ============================== platform linux -- Python 3.12.0, pytest-7.3.2, pluggy-1.3.0 -- /usr/bin/python3 cachedir: .pytest_cache hypothesis profile 'default' -> database=DirectoryBasedExampleDatabase('/builddir/build/BUILD/bidict-0.22.1/.hypothesis/examples') rootdir: /builddir/build/BUILD/bidict-0.22.1 configfile: pytest.ini testpaths: bidict, tests plugins: xdist-3.5.0, hypothesis-6.82.0 created: 8/8 workers 8 workers [82 items] scheduling tests via LoadScheduling tests/test_class_relationships.py::test_issubclass_bimap[frozenbidict] 
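[Annotation: %check deletes the microbenchmark test, points PYTHONPATH at the buildroot, and runs the suite through the project's run_tests.py, which hands off to pytest with the xdist plugin; the "8 workers" and "LoadScheduling" lines above come from pytest-xdist. Roughly what that amounts to when invoking pytest directly; the exact arguments run_tests.py passes are not visible in this log, and the testpaths ("bidict, tests") come from the project's pytest.ini.]

    # Run the suite in parallel with pytest-xdist, as the %check step does.
    import pytest

    # -n 8 mirrors PYTEST_XDIST_AUTO_NUM_WORKERS=8; "load" scheduling corresponds
    # to the "scheduling tests via LoadScheduling" line above.
    raise SystemExit(pytest.main(["-v", "-n", "8", "--dist", "load"]))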
bidict/__init__.py::bidict tests/test_class_relationships.py::test_issubclass_bimap[OrderedBidictBase] tests/test_class_relationships.py::test_issubclass_bimap[BidictBase] tests/test_class_relationships.py::test_issubclass_bimap[OrderedBidict] [gw0] [ 1%] PASSED bidict/__init__.py::bidict [gw1] [ 2%] PASSED tests/test_class_relationships.py::test_issubclass_bimap[BidictBase] [gw2] [ 3%] PASSED tests/test_class_relationships.py::test_issubclass_bimap[OrderedBidictBase] [gw4] [ 4%] PASSED tests/test_class_relationships.py::test_issubclass_bimap[OrderedBidict] [gw3] [ 6%] PASSED tests/test_class_relationships.py::test_issubclass_bimap[frozenbidict] tests/test_class_relationships.py::test_issubclass_bimap[MyNamedBidict] tests/test_class_relationships.py::test_not_issubclass_not_bimap[object] tests/test_class_relationships.py::test_not_issubclass_not_bimap[OrderedDict] [gw6] [ 7%] PASSED tests/test_class_relationships.py::test_not_issubclass_not_bimap[OrderedDict] [gw7] [ 8%] PASSED tests/test_class_relationships.py::test_not_issubclass_not_bimap[object] [gw5] [ 9%] PASSED tests/test_class_relationships.py::test_issubclass_bimap[MyNamedBidict] tests/test_bidict.txt::test_bidict.txt tests/test_class_relationships.py::test_issubclass_bimap[MutableBidict] tests/test_class_relationships.py::test_issubclass_bimap[bidict] [gw1] [ 10%] PASSED tests/test_class_relationships.py::test_issubclass_bimap[MutableBidict] tests/test_class_relationships.py::test_not_issubclass_not_bimap[int] tests/test_class_relationships.py::test_issubclass_bimap[FrozenOrderedBidict] tests/test_class_relationships.py::test_issubclass_mapping[BidictBase] tests/test_class_relationships.py::test_not_issubclass_not_bimap[dict] [gw2] [ 12%] PASSED tests/test_class_relationships.py::test_issubclass_bimap[bidict] tests/test_class_relationships.py::test_issubclass_mapping[frozenbidict] tests/test_class_relationships.py::test_issubclass_bimap[AbstractBimap] [gw3] [ 13%] PASSED tests/test_class_relationships.py::test_issubclass_bimap[FrozenOrderedBidict] [gw6] [ 14%] PASSED tests/test_class_relationships.py::test_not_issubclass_not_bimap[int] [gw1] [ 15%] PASSED tests/test_class_relationships.py::test_issubclass_mapping[frozenbidict] [gw5] [ 17%] PASSED tests/test_class_relationships.py::test_not_issubclass_not_bimap[dict] [gw4] [ 18%] PASSED tests/test_class_relationships.py::test_issubclass_bimap[AbstractBimap] tests/test_class_relationships.py::test_hashable_not_mutable[frozenbidict] [gw7] [ 19%] PASSED tests/test_class_relationships.py::test_issubclass_mapping[BidictBase] tests/test_class_relationships.py::test_issubclass_mapping[OrderedBidict] [gw2] [ 20%] PASSED tests/test_class_relationships.py::test_issubclass_mapping[OrderedBidict] tests/test_class_relationships.py::test_issubclass_hashable[frozenbidict] [gw7] [ 21%] PASSED tests/test_class_relationships.py::test_issubclass_hashable[frozenbidict] [gw3] [ 23%] PASSED tests/test_class_relationships.py::test_hashable_not_mutable[frozenbidict] tests/test_class_relationships.py::test_issubclass_mutable_and_mutable_bidirectional_mapping[OrderedBidict] tests/test_class_relationships.py::test_issubclass_internal [gw5] [ 24%] PASSED tests/test_class_relationships.py::test_issubclass_internal [gw6] [ 25%] PASSED tests/test_class_relationships.py::test_issubclass_mutable_and_mutable_bidirectional_mapping[OrderedBidict] tests/test_class_relationships.py::test_ordered_reversible[OrderedBidict] [gw4] [ 26%] PASSED tests/test_class_relationships.py::test_ordered_reversible[OrderedBidict] 
tests/test_class_relationships.py::test_issubclass_mapping[FrozenOrderedBidict] [gw1] [ 28%] PASSED tests/test_class_relationships.py::test_issubclass_mapping[FrozenOrderedBidict] tests/test_class_relationships.py::test_bimap_inverse_notimplemented [gw1] [ 29%] PASSED tests/test_class_relationships.py::test_bimap_inverse_notimplemented tests/test_class_relationships.py::test_hashable_not_mutable[FrozenOrderedBidict] tests/test_class_relationships.py::test_issubclass_hashable[FrozenOrderedBidict] [gw0] [ 30%] PASSED tests/test_bidict.txt::test_bidict.txt tests/test_class_relationships.py::test_issubclass_mapping[MutableBidict] [gw3] [ 31%] PASSED tests/test_class_relationships.py::test_hashable_not_mutable[FrozenOrderedBidict] tests/test_class_relationships.py::test_issubclass_mutable_and_mutable_bidirectional_mapping[bidict] [gw0] [ 32%] PASSED tests/test_class_relationships.py::test_issubclass_mapping[MutableBidict] tests/test_class_relationships.py::test_issubclass_mapping[OrderedBidictBase] [gw7] [ 34%] PASSED tests/test_class_relationships.py::test_issubclass_hashable[FrozenOrderedBidict] [gw2] [ 35%] PASSED tests/test_class_relationships.py::test_issubclass_mutable_and_mutable_bidirectional_mapping[bidict] tests/test_class_relationships.py::test_abstract_bimap_init_fails [gw5] [ 36%] PASSED tests/test_class_relationships.py::test_abstract_bimap_init_fails [gw0] [ 37%] PASSED tests/test_class_relationships.py::test_issubclass_mapping[OrderedBidictBase] tests/test_class_relationships.py::test_bidict_bases_init_succeed[BidictBase] tests/test_class_relationships.py::test_bidict_bases_init_succeed[MutableBidict] [gw3] [ 39%] PASSED tests/test_class_relationships.py::test_bidict_bases_init_succeed[MutableBidict] tests/test_class_relationships.py::test_bidict_reversible_matches_dict_reversible tests/test_class_relationships.py::test_bidict_bases_init_succeed[OrderedBidictBase] tests/test_class_relationships.py::test_bidict_reversible [gw1] [ 40%] PASSED tests/test_class_relationships.py::test_bidict_bases_init_succeed[BidictBase] tests/test_class_relationships.py::test_ordered_reversible[FrozenOrderedBidict] [gw4] [ 41%] PASSED tests/test_class_relationships.py::test_ordered_reversible[FrozenOrderedBidict] tests/test_class_relationships.py::test_issubclass_namedbidict [gw6] [ 42%] PASSED tests/test_class_relationships.py::test_issubclass_namedbidict tests/test_class_relationships.py::test_issubclass_mapping[bidict] [gw2] [ 43%] PASSED tests/test_class_relationships.py::test_bidict_reversible_matches_dict_reversible [gw5] [ 45%] PASSED tests/test_class_relationships.py::test_bidict_reversible tests/property_tests/test_properties.py::test_unequal_to_non_mapping [gw0] [ 46%] PASSED tests/test_class_relationships.py::test_issubclass_mapping[bidict] [gw7] [ 47%] PASSED tests/test_class_relationships.py::test_bidict_bases_init_succeed[OrderedBidictBase] tests/property_tests/test_properties.py::test_unequal_to_mapping_with_different_items tests/test_namedbidict.txt::test_namedbidict.txt tests/property_tests/test_properties.py::test_equal_hashables_have_same_hash tests/property_tests/test_properties.py::test_eq_correctly_defers_to_eq_of_non_mapping tests/test_metadata.py::test_metadata tests/property_tests/test_properties.py::test_equal_to_mapping_with_same_items [gw6] [ 48%] PASSED tests/test_metadata.py::test_metadata tests/test_orderedbidict.txt::test_orderedbidict.txt tests/property_tests/test_properties.py::test_unequal_order_sensitive_non_mapping [gw4] [ 50%] PASSED 
tests/test_namedbidict.txt::test_namedbidict.txt tests/property_tests/test_properties.py::test_unequal_order_sensitive_same_items_different_order [gw1] [ 51%] PASSED tests/test_orderedbidict.txt::test_orderedbidict.txt tests/property_tests/test_properties.py::test_equals_order_sensitive_same_items [gw4] [ 52%] FAILED tests/property_tests/test_properties.py::test_unequal_order_sensitive_same_items_different_order tests/property_tests/test_properties.py::test_cleared_bidicts_have_no_items [gw0] [ 53%] FAILED tests/property_tests/test_properties.py::test_equal_hashables_have_same_hash tests/property_tests/test_properties.py::test_setitem_with_dup_val_raises [gw7] [ 54%] FAILED tests/property_tests/test_properties.py::test_eq_correctly_defers_to_eq_of_non_mapping tests/property_tests/test_properties.py::test_put_with_dup_key_raises [gw3] [ 56%] FAILED tests/property_tests/test_properties.py::test_unequal_to_non_mapping tests/property_tests/test_properties.py::test_equals_matches_equals_order_sensitive [gw1] [ 57%] FAILED tests/property_tests/test_properties.py::test_equals_order_sensitive_same_items tests/property_tests/test_properties.py::test_consistency_after_method_call [gw1] [ 58%] FAILED tests/property_tests/test_properties.py::test_consistency_after_method_call tests/property_tests/test_properties.py::test_namedbidict_raises_on_invalid_name [gw0] [ 59%] FAILED tests/property_tests/test_properties.py::test_setitem_with_dup_val_raises [gw2] [ 60%] FAILED tests/property_tests/test_properties.py::test_unequal_to_mapping_with_different_items tests/property_tests/test_properties.py::test_bidict_iter tests/property_tests/test_properties.py::test_merge_operators [gw0] [ 62%] FAILED tests/property_tests/test_properties.py::test_bidict_iter tests/property_tests/test_properties.py::test_namedbidict_raises_on_invalid_base_type [gw7] [ 63%] FAILED tests/property_tests/test_properties.py::test_put_with_dup_key_raises tests/property_tests/test_properties.py::test_bidict_reversed [gw2] [ 64%] FAILED tests/property_tests/test_properties.py::test_merge_operators tests/property_tests/test_properties.py::test_namedbidict [gw1] [ 65%] FAILED tests/property_tests/test_properties.py::test_namedbidict_raises_on_invalid_name tests/property_tests/test_properties.py::test_namedbidict_raises_on_same_keyname_as_valname [gw7] [ 67%] FAILED tests/property_tests/test_properties.py::test_bidict_reversed tests/property_tests/test_properties.py::test_orderedbidict_nodes_freed_on_zero_refcount [gw5] [ 68%] FAILED tests/property_tests/test_properties.py::test_equal_to_mapping_with_same_items tests/property_tests/test_properties.py::test_setitem_with_dup_key_val_raises [gw6] [ 69%] PASSED tests/property_tests/test_properties.py::test_unequal_order_sensitive_non_mapping tests/property_tests/test_properties.py::test_bijectivity [gw5] [ 70%] FAILED tests/property_tests/test_properties.py::test_setitem_with_dup_key_val_raises tests/property_tests/test_properties.py::test_inverse_readonly [gw0] [ 71%] FAILED tests/property_tests/test_properties.py::test_namedbidict_raises_on_invalid_base_type tests/property_tests/test_properties.py::test_bidicts_freed_on_zero_refcount [gw5] [ 73%] FAILED tests/property_tests/test_properties.py::test_inverse_readonly [gw2] [ 74%] FAILED tests/property_tests/test_properties.py::test_namedbidict tests/property_tests/test_properties.py::test_orderedbidict_nodes_consistent [gw0] [ 75%] PASSED tests/property_tests/test_properties.py::test_bidicts_freed_on_zero_refcount 
tests/property_tests/test_properties.py::test_pickle_dynamically_generated_inverse_bidict tests/property_tests/test_properties.py::test_pickle_orderedbi_whose_order_disagrees_w_fwdm [gw5] [ 76%] PASSED tests/property_tests/test_properties.py::test_pickle_orderedbi_whose_order_disagrees_w_fwdm tests/property_tests/test_properties.py::test_deepcopy [gw0] [ 78%] PASSED tests/property_tests/test_properties.py::test_pickle_dynamically_generated_inverse_bidict [gw6] [ 79%] FAILED tests/property_tests/test_properties.py::test_bijectivity tests/property_tests/test_properties.py::test_iteritems_raises_on_too_many_args [gw0] [ 80%] PASSED tests/property_tests/test_properties.py::test_iteritems_raises_on_too_many_args tests/property_tests/test_properties.py::test_inverted_pairs tests/property_tests/test_properties.py::test_pickle [gw2] [ 81%] FAILED tests/property_tests/test_properties.py::test_orderedbidict_nodes_consistent tests/property_tests/test_properties.py::test_copy [gw1] [ 82%] FAILED tests/property_tests/test_properties.py::test_namedbidict_raises_on_same_keyname_as_valname tests/property_tests/test_properties.py::test_abc_slots [gw1] [ 84%] PASSED tests/property_tests/test_properties.py::test_abc_slots [gw5] [ 85%] FAILED tests/property_tests/test_properties.py::test_deepcopy tests/property_tests/test_properties.py::test_iteritems [gw4] [ 86%] FAILED tests/property_tests/test_properties.py::test_cleared_bidicts_have_no_items tests/property_tests/test_properties.py::test_putall_same_as_put_for_each_item [gw6] [ 87%] FAILED tests/property_tests/test_properties.py::test_pickle tests/property_tests/test_properties.py::test_inverted_bidict [gw4] [ 89%] FAILED tests/property_tests/test_properties.py::test_putall_same_as_put_for_each_item [gw6] [ 90%] FAILED tests/property_tests/test_properties.py::test_inverted_bidict [gw3] [ 91%] PASSED tests/property_tests/test_properties.py::test_equals_matches_equals_order_sensitive tests/property_tests/test_properties.py::test_frozenbidicts_hashable [gw0] [ 92%] PASSED tests/property_tests/test_properties.py::test_inverted_pairs [gw7] [ 93%] PASSED tests/property_tests/test_properties.py::test_orderedbidict_nodes_freed_on_zero_refcount tests/property_tests/test_properties.py::test_views tests/property_tests/test_properties.py::test_inv_aliases_inverse [gw3] [ 95%] FAILED tests/property_tests/test_properties.py::test_frozenbidicts_hashable [gw0] [ 96%] FAILED tests/property_tests/test_properties.py::test_views [gw2] [ 97%] PASSED tests/property_tests/test_properties.py::test_copy [gw7] [ 98%] PASSED tests/property_tests/test_properties.py::test_inv_aliases_inverse [gw5] [100%] FAILED tests/property_tests/test_properties.py::test_iteritems =================================== FAILURES =================================== ___________ test_unequal_order_sensitive_same_items_different_order ____________ [gw4] linux -- Python 3.12.0 /usr/bin/python3 @given(st.OBI_AND_OMAP_FROM_SAME_ITEMS_DIFF_ORDER) > def test_unequal_order_sensitive_same_items_different_order(ob_and_om: t.Any) -> None: E hypothesis.errors.FailedHealthCheck: Data generation is extremely slow: Only produced 0 valid examples in 1.44 seconds (4 invalid ones and 0 exceeded maximum size). Try decreasing size of the data you're generating (with e.g. max_size or max_leaves parameters). E See https://hypothesis.readthedocs.io/en/latest/healthchecks.html for more information about this. 
If you want to disable just this health check, add HealthCheck.too_slow to the suppress_health_check settings for this test. tests/property_tests/test_properties.py:143: FailedHealthCheck ---------------------------------- Hypothesis ---------------------------------- You can add @seed(65235886415951600301298080390148183826) to this test or run pytest with --hypothesis-seed=65235886415951600301298080390148183826 to reproduce this failure. _____________________ test_equal_hashables_have_same_hash ______________________ [gw0] linux -- Python 3.12.0 /usr/bin/python3 @given(st.HBI_AND_HMAP_FROM_SAME_ND_ITEMS) > def test_equal_hashables_have_same_hash(hashable_bidict_and_mapping: t.Any) -> None: E hypothesis.errors.FailedHealthCheck: Data generation is extremely slow: Only produced 2 valid examples in 1.32 seconds (0 invalid ones and 0 exceeded maximum size). Try decreasing size of the data you're generating (with e.g. max_size or max_leaves parameters). E See https://hypothesis.readthedocs.io/en/latest/healthchecks.html for more information about this. If you want to disable just this health check, add HealthCheck.too_slow to the suppress_health_check settings for this test. tests/property_tests/test_properties.py:97: FailedHealthCheck ---------------------------------- Hypothesis ---------------------------------- You can add @seed(21148195842123013807207045227143275851) to this test or run pytest with --hypothesis-seed=21148195842123013807207045227143275851 to reproduce this failure. ________________ test_eq_correctly_defers_to_eq_of_non_mapping _________________ [gw7] linux -- Python 3.12.0 /usr/bin/python3 @given(st.BIDICTS) > def test_eq_correctly_defers_to_eq_of_non_mapping(bi: Bi) -> None: E hypothesis.errors.FailedHealthCheck: Data generation is extremely slow: Only produced 7 valid examples in 1.00 seconds (0 invalid ones and 3 exceeded maximum size). Try decreasing size of the data you're generating (with e.g. max_size or max_leaves parameters). E See https://hypothesis.readthedocs.io/en/latest/healthchecks.html for more information about this. If you want to disable just this health check, add HealthCheck.too_slow to the suppress_health_check settings for this test. tests/property_tests/test_properties.py:64: FailedHealthCheck ---------------------------------- Hypothesis ---------------------------------- You can add @seed(271086215009457453747629706479455151084) to this test or run pytest with --hypothesis-seed=271086215009457453747629706479455151084 to reproduce this failure. _________________________ test_unequal_to_non_mapping __________________________ [gw3] linux -- Python 3.12.0 /usr/bin/python3 @given(st.BIDICTS, st.NON_MAPPINGS) > def test_unequal_to_non_mapping(bi: Bi, not_a_mapping: t.Any) -> None: E hypothesis.errors.FailedHealthCheck: Data generation is extremely slow: Only produced 8 valid examples in 1.90 seconds (0 invalid ones and 3 exceeded maximum size). Try decreasing size of the data you're generating (with e.g. max_size or max_leaves parameters). E See https://hypothesis.readthedocs.io/en/latest/healthchecks.html for more information about this. If you want to disable just this health check, add HealthCheck.too_slow to the suppress_health_check settings for this test. 
tests/property_tests/test_properties.py:55: FailedHealthCheck ---------------------------------- Hypothesis ---------------------------------- You can add @seed(49524555224252635154278951155044747065) to this test or run pytest with --hypothesis-seed=49524555224252635154278951155044747065 to reproduce this failure. ____________________ test_equals_order_sensitive_same_items ____________________ [gw1] linux -- Python 3.12.0 /usr/bin/python3 @given(st.BI_AND_MAP_FROM_SAME_ND_ITEMS) > def test_equals_order_sensitive_same_items(bi_and_map_from_same_items: t.Any) -> None: E hypothesis.errors.FailedHealthCheck: Data generation is extremely slow: Only produced 7 valid examples in 1.90 seconds (0 invalid ones and 2 exceeded maximum size). Try decreasing size of the data you're generating (with e.g. max_size or max_leaves parameters). E See https://hypothesis.readthedocs.io/en/latest/healthchecks.html for more information about this. If you want to disable just this health check, add HealthCheck.too_slow to the suppress_health_check settings for this test. tests/property_tests/test_properties.py:131: FailedHealthCheck ---------------------------------- Hypothesis ---------------------------------- You can add @seed(17197781016158119263969789029854555645) to this test or run pytest with --hypothesis-seed=17197781016158119263969789029854555645 to reproduce this failure. ______________________ test_consistency_after_method_call ______________________ [gw1] linux -- Python 3.12.0 /usr/bin/python3 @given(st.BI_AND_CMPDICT_FROM_SAME_ITEMS, st.DATA) > def test_consistency_after_method_call(bi_and_cmp_dict: t.Any, data: t.Any) -> None: E hypothesis.errors.FailedHealthCheck: Data generation is extremely slow: Only produced 4 valid examples in 1.02 seconds (0 invalid ones and 0 exceeded maximum size). Try decreasing size of the data you're generating (with e.g. max_size or max_leaves parameters). E See https://hypothesis.readthedocs.io/en/latest/healthchecks.html for more information about this. If you want to disable just this health check, add HealthCheck.too_slow to the suppress_health_check settings for this test. tests/property_tests/test_properties.py:253: FailedHealthCheck ---------------------------------- Hypothesis ---------------------------------- You can add @seed(204763423560876857230356373631983943912) to this test or run pytest with --hypothesis-seed=204763423560876857230356373631983943912 to reproduce this failure. _______________________ test_setitem_with_dup_val_raises _______________________ [gw0] linux -- Python 3.12.0 /usr/bin/python3 @given(st.MUTABLE_BIDICTS, st.DIFF_ATOMS, st.RANDOMS) > def test_setitem_with_dup_val_raises(bi: MBi, new_key: t.Any, rand: t.Any) -> None: E hypothesis.errors.FailedHealthCheck: Data generation is extremely slow: Only produced 1 valid examples in 1.26 seconds (9 invalid ones and 0 exceeded maximum size). Try decreasing size of the data you're generating (with e.g. max_size or max_leaves parameters). E See https://hypothesis.readthedocs.io/en/latest/healthchecks.html for more information about this. If you want to disable just this health check, add HealthCheck.too_slow to the suppress_health_check settings for this test. tests/property_tests/test_properties.py:198: FailedHealthCheck ---------------------------------- Hypothesis ---------------------------------- You can add @seed(312625507257891688368427051444887049358) to this test or run pytest with --hypothesis-seed=312625507257891688368427051444887049358 to reproduce this failure. 
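
The repeated FailedHealthCheck errors above are Hypothesis's too_slow health check tripping on a slow, heavily loaded builder rather than a defect in bidict itself. Following the advice printed with each failure, one way to quiet them for a build environment like this is a Hypothesis settings profile that suppresses that health check. A minimal sketch, assuming a conftest.py (not part of the upstream sources) dropped somewhere pytest will collect it:

    # conftest.py (sketch) -- assumed helper, not shipped with bidict:
    # register a Hypothesis profile suited to slow build hosts and load it
    # so every @given test picks it up.
    from hypothesis import HealthCheck, settings

    settings.register_profile(
        "rpmbuild",
        suppress_health_check=[HealthCheck.too_slow],  # the health check failing above
        deadline=None,  # also avoids the DeadlineExceeded/Flaky failures further down
    )
    settings.load_profile("rpmbuild")

Registering a profile keeps the relaxation in one place instead of decorating each affected test individually.
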
_________________ test_unequal_to_mapping_with_different_items _________________ [gw2] linux -- Python 3.12.0 /usr/bin/python3 @given(st.BI_AND_MAP_FROM_DIFF_ITEMS) > def test_unequal_to_mapping_with_different_items(bi_and_map_from_diff_items: t.Any) -> None: E hypothesis.errors.FailedHealthCheck: Data generation is extremely slow: Only produced 5 valid examples in 1.03 seconds (0 invalid ones and 1 exceeded maximum size). Try decreasing size of the data you're generating (with e.g. max_size or max_leaves parameters). E See https://hypothesis.readthedocs.io/en/latest/healthchecks.html for more information about this. If you want to disable just this health check, add HealthCheck.too_slow to the suppress_health_check settings for this test. tests/property_tests/test_properties.py:71: FailedHealthCheck ---------------------------------- Hypothesis ---------------------------------- You can add @seed(33913675016619061669455547723175088875) to this test or run pytest with --hypothesis-seed=33913675016619061669455547723175088875 to reproduce this failure. _______________________________ test_bidict_iter _______________________________ [gw0] linux -- Python 3.12.0 /usr/bin/python3 @given(st.BI_AND_MAP_FROM_SAME_ND_ITEMS) > def test_bidict_iter(bi_and_mapping: t.Any) -> None: E hypothesis.errors.FailedHealthCheck: Data generation is extremely slow: Only produced 1 valid examples in 1.00 seconds (0 invalid ones and 1 exceeded maximum size). Try decreasing size of the data you're generating (with e.g. max_size or max_leaves parameters). E See https://hypothesis.readthedocs.io/en/latest/healthchecks.html for more information about this. If you want to disable just this health check, add HealthCheck.too_slow to the suppress_health_check settings for this test. tests/property_tests/test_properties.py:340: FailedHealthCheck ---------------------------------- Hypothesis ---------------------------------- You can add @seed(176010979684566982645136718556140296983) to this test or run pytest with --hypothesis-seed=176010979684566982645136718556140296983 to reproduce this failure. _________________________ test_put_with_dup_key_raises _________________________ [gw7] linux -- Python 3.12.0 /usr/bin/python3 @given(st.MUTABLE_BIDICTS, st.DIFF_ATOMS, st.RANDOMS) > def test_put_with_dup_key_raises(bi: MBi, new_val: t.Any, rand: t.Any) -> None: E hypothesis.errors.FailedHealthCheck: Data generation is extremely slow: Only produced 0 valid examples in 2.00 seconds (3 invalid ones and 0 exceeded maximum size). Try decreasing size of the data you're generating (with e.g. max_size or max_leaves parameters). E See https://hypothesis.readthedocs.io/en/latest/healthchecks.html for more information about this. If you want to disable just this health check, add HealthCheck.too_slow to the suppress_health_check settings for this test. tests/property_tests/test_properties.py:224: FailedHealthCheck ---------------------------------- Hypothesis ---------------------------------- You can add @seed(79397479688638377115546050874533035027) to this test or run pytest with --hypothesis-seed=79397479688638377115546050874533035027 to reproduce this failure. 
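
Each failure report also prints a seed that replays exactly the generated example. A self-contained sketch of the decorator form; the strategy and test body here are placeholders, and the seed value is simply copied from the test_put_with_dup_key_raises report above:

    # Illustrative only: @seed pins Hypothesis's randomness so a reported
    # failure replays deterministically; the strategy below is a stand-in.
    from hypothesis import given, seed, strategies as st

    @seed(79397479688638377115546050874533035027)  # value copied from the log above
    @given(st.integers())
    def test_replay_reported_failure(x: int) -> None:
        assert isinstance(x, int)

Equivalently, as the report itself notes, the run can be replayed without editing the tests by passing --hypothesis-seed=<value> to pytest.
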
_____________________________ test_merge_operators _____________________________ [gw2] linux -- Python 3.12.0 /usr/bin/python3 @given(st.BIDICTS, st.NON_BI_MAPPINGS) > def test_merge_operators(bi: Bi, mapping: t.Mapping[t.Any, t.Any]) -> None: E hypothesis.errors.FailedHealthCheck: Data generation is extremely slow: Only produced 5 valid examples in 1.37 seconds (0 invalid ones and 2 exceeded maximum size). Try decreasing size of the data you're generating (with e.g. max_size or max_leaves parameters). E See https://hypothesis.readthedocs.io/en/latest/healthchecks.html for more information about this. If you want to disable just this health check, add HealthCheck.too_slow to the suppress_health_check settings for this test. tests/property_tests/test_properties.py:171: FailedHealthCheck ---------------------------------- Hypothesis ---------------------------------- You can add @seed(24013640858724655534283036567479561676) to this test or run pytest with --hypothesis-seed=24013640858724655534283036567479561676 to reproduce this failure. ___________________ test_namedbidict_raises_on_invalid_name ____________________ [gw1] linux -- Python 3.12.0 /usr/bin/python3 @given(st.NAMEDBIDICT_NAMES_SOME_INVALID) > def test_namedbidict_raises_on_invalid_name(names: tuple[str, str, str]) -> None: E hypothesis.errors.FailedHealthCheck: Data generation is extremely slow: Only produced 2 valid examples in 1.02 seconds (0 invalid ones and 0 exceeded maximum size). Try decreasing size of the data you're generating (with e.g. max_size or max_leaves parameters). E See https://hypothesis.readthedocs.io/en/latest/healthchecks.html for more information about this. If you want to disable just this health check, add HealthCheck.too_slow to the suppress_health_check settings for this test. tests/property_tests/test_properties.py:362: FailedHealthCheck ---------------------------------- Hypothesis ---------------------------------- You can add @seed(263470724988251938900191863932918052416) to this test or run pytest with --hypothesis-seed=263470724988251938900191863932918052416 to reproduce this failure. _____________________________ test_bidict_reversed _____________________________ [gw7] linux -- Python 3.12.0 /usr/bin/python3 @given(st.RBI_AND_RMAP_FROM_SAME_ND_ITEMS) > def test_bidict_reversed(rb_and_rd: t.Any) -> None: E hypothesis.errors.FailedHealthCheck: Data generation is extremely slow: Only produced 9 valid examples in 1.39 seconds (0 invalid ones and 2 exceeded maximum size). Try decreasing size of the data you're generating (with e.g. max_size or max_leaves parameters). E See https://hypothesis.readthedocs.io/en/latest/healthchecks.html for more information about this. If you want to disable just this health check, add HealthCheck.too_slow to the suppress_health_check settings for this test. tests/property_tests/test_properties.py:347: FailedHealthCheck ---------------------------------- Hypothesis ---------------------------------- You can add @seed(104544325833008652871509885006359829262) to this test or run pytest with --hypothesis-seed=104544325833008652871509885006359829262 to reproduce this failure. ____________________ test_equal_to_mapping_with_same_items _____________________ [gw5] linux -- Python 3.12.0 /usr/bin/python3 self = data = ConjectureData(INTERESTING, 51 bytes, frozen) def _execute_once_for_engine(self, data): """Wrapper around ``execute_once`` that intercepts test failure exceptions and single-test control exceptions, and turns them into appropriate method calls to `data` instead. 
This allows the engine to assume that any exception other than ``StopTest`` must be a fatal error, and should stop the entire engine. """ try: trace = frozenset() if ( self.failed_normally and not self.failed_due_to_deadline and Phase.shrink in self.settings.phases and Phase.explain in self.settings.phases and sys.gettrace() is None and not PYPY ): # pragma: no cover # This is in fact covered by our *non-coverage* tests, but due to the # settrace() contention *not* by our coverage tests. Ah well. tracer = Tracer() try: sys.settrace(tracer.trace) result = self.execute_once(data) if data.status == Status.VALID: self.explain_traces[None].add(frozenset(tracer.branches)) finally: sys.settrace(None) trace = frozenset(tracer.branches) else: > result = self.execute_once(data) /usr/lib/python3.12/site-packages/hypothesis/core.py:850: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = data = ConjectureData(INTERESTING, 51 bytes, frozen), print_example = False is_final = False, expected_failure = None, example_kwargs = None def execute_once( self, data, print_example=False, is_final=False, expected_failure=None, example_kwargs=None, ): """Run the test function once, using ``data`` as input. If the test raises an exception, it will propagate through to the caller of this method. Depending on its type, this could represent an ordinary test failure, or a fatal error, or a control exception. If this method returns normally, the test might have passed, or it might have placed ``data`` in an unsuccessful state and then swallowed the corresponding control exception. """ self.ever_executed = True data.is_find = self.is_find text_repr = None if self.settings.deadline is None: test = self.test else: @proxies(self.test) def test(*args, **kwargs): self.__test_runtime = None initial_draws = len(data.draw_times) start = time.perf_counter() result = self.test(*args, **kwargs) finish = time.perf_counter() internal_draw_time = sum(data.draw_times[initial_draws:]) runtime = datetime.timedelta( seconds=finish - start - internal_draw_time ) self.__test_runtime = runtime current_deadline = self.settings.deadline if not is_final: current_deadline = (current_deadline // 4) * 5 if runtime >= current_deadline: raise DeadlineExceeded(runtime, self.settings.deadline) return result def run(data): # Set up dynamic context needed by a single test run. with local_settings(self.settings): with deterministic_PRNG(): with BuildContext(data, is_final=is_final) as context: if self.stuff.selfy is not None: data.hypothesis_runner = self.stuff.selfy # Generate all arguments to the test function. 
args = self.stuff.args kwargs = dict(self.stuff.kwargs) if example_kwargs is None: a, kw, argslices = context.prep_args_kwargs_from_strategies( (), self.stuff.given_kwargs ) assert not a, "strategies all moved to kwargs by now" else: kw = example_kwargs argslices = {} kwargs.update(kw) if expected_failure is not None: nonlocal text_repr text_repr = repr_call(test, args, kwargs) if print_example or current_verbosity() >= Verbosity.verbose: printer = RepresentationPrinter(context=context) if print_example: printer.text("Falsifying example:") else: printer.text("Trying example:") if self.print_given_args: printer.text(" ") printer.repr_call( test.__name__, args, kwargs, force_split=True, arg_slices=argslices, leading_comment=( "# " + context.data.slice_comments[(0, 0)] if (0, 0) in context.data.slice_comments else None ), ) report(printer.getvalue()) return test(*args, **kwargs) # Run the test function once, via the executor hook. # In most cases this will delegate straight to `run(data)`. > result = self.test_runner(data, run) /usr/lib/python3.12/site-packages/hypothesis/core.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ data = ConjectureData(INTERESTING, 51 bytes, frozen) function = .run at 0xffffff9b3cf920> def default_new_style_executor(data, function): > return function(data) /usr/lib/python3.12/site-packages/hypothesis/executors.py:47: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ data = ConjectureData(INTERESTING, 51 bytes, frozen) def run(data): # Set up dynamic context needed by a single test run. with local_settings(self.settings): with deterministic_PRNG(): with BuildContext(data, is_final=is_final) as context: if self.stuff.selfy is not None: data.hypothesis_runner = self.stuff.selfy # Generate all arguments to the test function. 
args = self.stuff.args kwargs = dict(self.stuff.kwargs) if example_kwargs is None: a, kw, argslices = context.prep_args_kwargs_from_strategies( (), self.stuff.given_kwargs ) assert not a, "strategies all moved to kwargs by now" else: kw = example_kwargs argslices = {} kwargs.update(kw) if expected_failure is not None: nonlocal text_repr text_repr = repr_call(test, args, kwargs) if print_example or current_verbosity() >= Verbosity.verbose: printer = RepresentationPrinter(context=context) if print_example: printer.text("Falsifying example:") else: printer.text("Trying example:") if self.print_given_args: printer.text(" ") printer.repr_call( test.__name__, args, kwargs, force_split=True, arg_slices=argslices, leading_comment=( "# " + context.data.slice_comments[(0, 0)] if (0, 0) in context.data.slice_comments else None ), ) report(printer.getvalue()) > return test(*args, **kwargs) /usr/lib/python3.12/site-packages/hypothesis/core.py:785: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ bi_and_map_from_same_items = (UserBidict({True: True, 5: -20422, 62: 31436, None: 101}), UserBidict({True: True, 5: -20422, 62: 31436, None: 101})) @given(st.BI_AND_MAP_FROM_SAME_ND_ITEMS) > def test_equal_to_mapping_with_same_items(bi_and_map_from_same_items: t.Any) -> None: tests/property_tests/test_properties.py:79: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ((UserBidict({True: True, 5: -20422, 62: 31436, None: 101}), UserBidict({True: True, 5: -20422, 62: 31436, None: 101})),) kwargs = {}, initial_draws = 1, start = 8065794.658922437, result = None finish = 8065795.505161563, internal_draw_time = 0 runtime = datetime.timedelta(microseconds=846239) current_deadline = datetime.timedelta(microseconds=250000) @proxies(self.test) def test(*args, **kwargs): self.__test_runtime = None initial_draws = len(data.draw_times) start = time.perf_counter() result = self.test(*args, **kwargs) finish = time.perf_counter() internal_draw_time = sum(data.draw_times[initial_draws:]) runtime = datetime.timedelta( seconds=finish - start - internal_draw_time ) self.__test_runtime = runtime current_deadline = self.settings.deadline if not is_final: current_deadline = (current_deadline // 4) * 5 if runtime >= current_deadline: > raise DeadlineExceeded(runtime, self.settings.deadline) E hypothesis.errors.DeadlineExceeded: Test took 846.24ms, which exceeds the deadline of 200.00ms /usr/lib/python3.12/site-packages/hypothesis/core.py:737: DeadlineExceeded The above exception was the direct cause of the following exception: @given(st.BI_AND_MAP_FROM_SAME_ND_ITEMS) > def test_equal_to_mapping_with_same_items(bi_and_map_from_same_items: t.Any) -> None: tests/property_tests/test_properties.py:79: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = data = ConjectureData(VALID, 51 bytes, frozen), print_example = True is_final = True expected_failure = (DeadlineExceeded('Test took 846.24ms, which exceeds the deadline of 200.00ms'), 'args = ((UserBidict({True: True, 5: ...hich exceeds the deadline of 200.00ms\n\n/usr/lib/python3.12/site-packages/hypothesis/core.py:737: DeadlineExceeded\n') example_kwargs = None def execute_once( self, data, print_example=False, is_final=False, expected_failure=None, example_kwargs=None, ): """Run the test function once, using ``data`` as input. If the test raises an exception, it will propagate through to the caller of this method. 
Depending on its type, this could represent an ordinary test failure, or a fatal error, or a control exception. If this method returns normally, the test might have passed, or it might have placed ``data`` in an unsuccessful state and then swallowed the corresponding control exception. """ self.ever_executed = True data.is_find = self.is_find text_repr = None if self.settings.deadline is None: test = self.test else: @proxies(self.test) def test(*args, **kwargs): self.__test_runtime = None initial_draws = len(data.draw_times) start = time.perf_counter() result = self.test(*args, **kwargs) finish = time.perf_counter() internal_draw_time = sum(data.draw_times[initial_draws:]) runtime = datetime.timedelta( seconds=finish - start - internal_draw_time ) self.__test_runtime = runtime current_deadline = self.settings.deadline if not is_final: current_deadline = (current_deadline // 4) * 5 if runtime >= current_deadline: raise DeadlineExceeded(runtime, self.settings.deadline) return result def run(data): # Set up dynamic context needed by a single test run. with local_settings(self.settings): with deterministic_PRNG(): with BuildContext(data, is_final=is_final) as context: if self.stuff.selfy is not None: data.hypothesis_runner = self.stuff.selfy # Generate all arguments to the test function. args = self.stuff.args kwargs = dict(self.stuff.kwargs) if example_kwargs is None: a, kw, argslices = context.prep_args_kwargs_from_strategies( (), self.stuff.given_kwargs ) assert not a, "strategies all moved to kwargs by now" else: kw = example_kwargs argslices = {} kwargs.update(kw) if expected_failure is not None: nonlocal text_repr text_repr = repr_call(test, args, kwargs) if print_example or current_verbosity() >= Verbosity.verbose: printer = RepresentationPrinter(context=context) if print_example: printer.text("Falsifying example:") else: printer.text("Trying example:") if self.print_given_args: printer.text(" ") printer.repr_call( test.__name__, args, kwargs, force_split=True, arg_slices=argslices, leading_comment=( "# " + context.data.slice_comments[(0, 0)] if (0, 0) in context.data.slice_comments else None ), ) report(printer.getvalue()) return test(*args, **kwargs) # Run the test function once, via the executor hook. # In most cases this will delegate straight to `run(data)`. result = self.test_runner(data, run) # If a failure was expected, it should have been raised already, so # instead raise an appropriate diagnostic error. if expected_failure is not None: exception, traceback = expected_failure if ( isinstance(exception, DeadlineExceeded) and self.__test_runtime is not None ): report( "Unreliable test timings! On an initial run, this " "test took %.2fms, which exceeded the deadline of " "%.2fms, but on a subsequent run it took %.2f ms, " "which did not. If you expect this sort of " "variability in your test timings, consider turning " "deadlines off for this test by setting deadline=None." % ( exception.runtime.total_seconds() * 1000, self.settings.deadline.total_seconds() * 1000, self.__test_runtime.total_seconds() * 1000, ) ) else: report("Failed to reproduce exception. 
Expected: \n" + traceback) > raise Flaky( f"Hypothesis {text_repr} produces unreliable results: " "Falsified on the first call but did not on a subsequent one" ) from exception E hypothesis.errors.Flaky: Hypothesis test_equal_to_mapping_with_same_items(bi_and_map_from_same_items=(UserBidict({True: True, 5: -20422, 62: 31436, None: 101}), E UserBidict({True: True, 5: -20422, 62: 31436, None: 101}))) produces unreliable results: Falsified on the first call but did not on a subsequent one E Falsifying example: test_equal_to_mapping_with_same_items( E bi_and_map_from_same_items=(UserBidict({True: True, 5: -20422, 62: 31436, None: 101}), E UserBidict({True: True, 5: -20422, 62: 31436, None: 101})), E ) E Unreliable test timings! On an initial run, this test took 846.24ms, which exceeded the deadline of 200.00ms, but on a subsequent run it took 1.73 ms, which did not. If you expect this sort of variability in your test timings, consider turning deadlines off for this test by setting deadline=None. /usr/lib/python3.12/site-packages/hypothesis/core.py:814: Flaky _____________________ test_setitem_with_dup_key_val_raises _____________________ [gw5] linux -- Python 3.12.0 /usr/bin/python3 @given(st.MUTABLE_BIDICTS, st.RANDOMS) > def test_setitem_with_dup_key_val_raises(bi: MBi, rand: t.Any) -> None: E hypothesis.errors.FailedHealthCheck: Data generation is extremely slow: Only produced 1 valid examples in 1.24 seconds (5 invalid ones and 0 exceeded maximum size). Try decreasing size of the data you're generating (with e.g. max_size or max_leaves parameters). E See https://hypothesis.readthedocs.io/en/latest/healthchecks.html for more information about this. If you want to disable just this health check, add HealthCheck.too_slow to the suppress_health_check settings for this test. tests/property_tests/test_properties.py:210: FailedHealthCheck ---------------------------------- Hypothesis ---------------------------------- You can add @seed(236400581480145334596676989623777436362) to this test or run pytest with --hypothesis-seed=236400581480145334596676989623777436362 to reproduce this failure. _________________ test_namedbidict_raises_on_invalid_base_type _________________ [gw0] linux -- Python 3.12.0 /usr/bin/python3 @given(st.NAMEDBIDICT_NAMES_ALL_VALID, st.NON_BI_MAPPING_TYPES) > def test_namedbidict_raises_on_invalid_base_type(names: tuple[str, str, str], invalid_base_type: t.Any) -> None: E hypothesis.errors.FailedHealthCheck: Data generation is extremely slow: Only produced 3 valid examples in 1.03 seconds (5 invalid ones and 0 exceeded maximum size). Try decreasing size of the data you're generating (with e.g. max_size or max_leaves parameters). E See https://hypothesis.readthedocs.io/en/latest/healthchecks.html for more information about this. If you want to disable just this health check, add HealthCheck.too_slow to the suppress_health_check settings for this test. tests/property_tests/test_properties.py:378: FailedHealthCheck ---------------------------------- Hypothesis ---------------------------------- You can add @seed(136616722681363598883754607707851443180) to this test or run pytest with --hypothesis-seed=136616722681363598883754607707851443180 to reproduce this failure. 
____________________________ test_inverse_readonly _____________________________ [gw5] linux -- Python 3.12.0 /usr/bin/python3 @given(st.BIDICTS) > def test_inverse_readonly(bi: Bi) -> None: E hypothesis.errors.FailedHealthCheck: Data generation is extremely slow: Only produced 1 valid examples in 1.18 seconds (0 invalid ones and 1 exceeded maximum size). Try decreasing size of the data you're generating (with e.g. max_size or max_leaves parameters). E See https://hypothesis.readthedocs.io/en/latest/healthchecks.html for more information about this. If you want to disable just this health check, add HealthCheck.too_slow to the suppress_health_check settings for this test. tests/property_tests/test_properties.py:464: FailedHealthCheck ---------------------------------- Hypothesis ---------------------------------- You can add @seed(39014731271597668274874235735901831060) to this test or run pytest with --hypothesis-seed=39014731271597668274874235735901831060 to reproduce this failure. _______________________________ test_namedbidict _______________________________ [gw2] linux -- Python 3.12.0 /usr/bin/python3 @given(st.NAMEDBIDICTS) > def test_namedbidict(nb: t.Any) -> None: E hypothesis.errors.FailedHealthCheck: Data generation is extremely slow: Only produced 2 valid examples in 1.35 seconds (5 invalid ones and 0 exceeded maximum size). Try decreasing size of the data you're generating (with e.g. max_size or max_leaves parameters). E See https://hypothesis.readthedocs.io/en/latest/healthchecks.html for more information about this. If you want to disable just this health check, add HealthCheck.too_slow to the suppress_health_check settings for this test. tests/property_tests/test_properties.py:385: FailedHealthCheck ---------------------------------- Hypothesis ---------------------------------- You can add @seed(278157263048670283730292794298838922591) to this test or run pytest with --hypothesis-seed=278157263048670283730292794298838922591 to reproduce this failure. _______________________________ test_bijectivity _______________________________ [gw6] linux -- Python 3.12.0 /usr/bin/python3 @given(st.BIDICTS) > def test_bijectivity(bi: Bi) -> None: E hypothesis.errors.FailedHealthCheck: Data generation is extremely slow: Only produced 4 valid examples in 1.83 seconds (0 invalid ones and 1 exceeded maximum size). Try decreasing size of the data you're generating (with e.g. max_size or max_leaves parameters). E See https://hypothesis.readthedocs.io/en/latest/healthchecks.html for more information about this. If you want to disable just this health check, add HealthCheck.too_slow to the suppress_health_check settings for this test. tests/property_tests/test_properties.py:236: FailedHealthCheck ---------------------------------- Hypothesis ---------------------------------- You can add @seed(336085193076892757391259510717477361702) to this test or run pytest with --hypothesis-seed=336085193076892757391259510717477361702 to reproduce this failure. _____________________ test_orderedbidict_nodes_consistent ______________________ [gw2] linux -- Python 3.12.0 /usr/bin/python3 @given(st.ORDERED_BIDICTS) > def test_orderedbidict_nodes_consistent(ob: OBi) -> None: E hypothesis.errors.FailedHealthCheck: Data generation is extremely slow: Only produced 7 valid examples in 1.47 seconds (0 invalid ones and 5 exceeded maximum size). Try decreasing size of the data you're generating (with e.g. max_size or max_leaves parameters). 
E See https://hypothesis.readthedocs.io/en/latest/healthchecks.html for more information about this. If you want to disable just this health check, add HealthCheck.too_slow to the suppress_health_check settings for this test. tests/property_tests/test_properties.py:440: FailedHealthCheck ---------------------------------- Hypothesis ---------------------------------- You can add @seed(41778782918639323888087747574008297025) to this test or run pytest with --hypothesis-seed=41778782918639323888087747574008297025 to reproduce this failure. ______________ test_namedbidict_raises_on_same_keyname_as_valname ______________ [gw1] linux -- Python 3.12.0 /usr/bin/python3 @given(st.NAMEDBIDICT_NAMES_ALL_VALID) > def test_namedbidict_raises_on_same_keyname_as_valname(names: tuple[str, str, str]) -> None: E hypothesis.errors.FailedHealthCheck: Data generation is extremely slow: Only produced 3 valid examples in 1.52 seconds (5 invalid ones and 0 exceeded maximum size). Try decreasing size of the data you're generating (with e.g. max_size or max_leaves parameters). E See https://hypothesis.readthedocs.io/en/latest/healthchecks.html for more information about this. If you want to disable just this health check, add HealthCheck.too_slow to the suppress_health_check settings for this test. tests/property_tests/test_properties.py:370: FailedHealthCheck ---------------------------------- Hypothesis ---------------------------------- You can add @seed(33902595958426757993708839095112155186) to this test or run pytest with --hypothesis-seed=33902595958426757993708839095112155186 to reproduce this failure. ________________________________ test_deepcopy _________________________________ [gw5] linux -- Python 3.12.0 /usr/bin/python3 @given(st.BIDICTS) > def test_deepcopy(bi: Bi) -> None: E hypothesis.errors.FailedHealthCheck: Data generation is extremely slow: Only produced 3 valid examples in 1.35 seconds (0 invalid ones and 2 exceeded maximum size). Try decreasing size of the data you're generating (with e.g. max_size or max_leaves parameters). E See https://hypothesis.readthedocs.io/en/latest/healthchecks.html for more information about this. If you want to disable just this health check, add HealthCheck.too_slow to the suppress_health_check settings for this test. tests/property_tests/test_properties.py:544: FailedHealthCheck ---------------------------------- Hypothesis ---------------------------------- You can add @seed(184943002461601432438107430142920051923) to this test or run pytest with --hypothesis-seed=184943002461601432438107430142920051923 to reproduce this failure. ______________________ test_cleared_bidicts_have_no_items ______________________ [gw4] linux -- Python 3.12.0 /usr/bin/python3 self = data = ConjectureData(INTERESTING, 18 bytes, frozen) def _execute_once_for_engine(self, data): """Wrapper around ``execute_once`` that intercepts test failure exceptions and single-test control exceptions, and turns them into appropriate method calls to `data` instead. This allows the engine to assume that any exception other than ``StopTest`` must be a fatal error, and should stop the entire engine. """ try: trace = frozenset() if ( self.failed_normally and not self.failed_due_to_deadline and Phase.shrink in self.settings.phases and Phase.explain in self.settings.phases and sys.gettrace() is None and not PYPY ): # pragma: no cover # This is in fact covered by our *non-coverage* tests, but due to the # settrace() contention *not* by our coverage tests. Ah well. 
tracer = Tracer() try: sys.settrace(tracer.trace) result = self.execute_once(data) if data.status == Status.VALID: self.explain_traces[None].add(frozenset(tracer.branches)) finally: sys.settrace(None) trace = frozenset(tracer.branches) else: > result = self.execute_once(data) /usr/lib/python3.12/site-packages/hypothesis/core.py:850: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = data = ConjectureData(INTERESTING, 18 bytes, frozen), print_example = False is_final = False, expected_failure = None, example_kwargs = None def execute_once( self, data, print_example=False, is_final=False, expected_failure=None, example_kwargs=None, ): """Run the test function once, using ``data`` as input. If the test raises an exception, it will propagate through to the caller of this method. Depending on its type, this could represent an ordinary test failure, or a fatal error, or a control exception. If this method returns normally, the test might have passed, or it might have placed ``data`` in an unsuccessful state and then swallowed the corresponding control exception. """ self.ever_executed = True data.is_find = self.is_find text_repr = None if self.settings.deadline is None: test = self.test else: @proxies(self.test) def test(*args, **kwargs): self.__test_runtime = None initial_draws = len(data.draw_times) start = time.perf_counter() result = self.test(*args, **kwargs) finish = time.perf_counter() internal_draw_time = sum(data.draw_times[initial_draws:]) runtime = datetime.timedelta( seconds=finish - start - internal_draw_time ) self.__test_runtime = runtime current_deadline = self.settings.deadline if not is_final: current_deadline = (current_deadline // 4) * 5 if runtime >= current_deadline: raise DeadlineExceeded(runtime, self.settings.deadline) return result def run(data): # Set up dynamic context needed by a single test run. with local_settings(self.settings): with deterministic_PRNG(): with BuildContext(data, is_final=is_final) as context: if self.stuff.selfy is not None: data.hypothesis_runner = self.stuff.selfy # Generate all arguments to the test function. args = self.stuff.args kwargs = dict(self.stuff.kwargs) if example_kwargs is None: a, kw, argslices = context.prep_args_kwargs_from_strategies( (), self.stuff.given_kwargs ) assert not a, "strategies all moved to kwargs by now" else: kw = example_kwargs argslices = {} kwargs.update(kw) if expected_failure is not None: nonlocal text_repr text_repr = repr_call(test, args, kwargs) if print_example or current_verbosity() >= Verbosity.verbose: printer = RepresentationPrinter(context=context) if print_example: printer.text("Falsifying example:") else: printer.text("Trying example:") if self.print_given_args: printer.text(" ") printer.repr_call( test.__name__, args, kwargs, force_split=True, arg_slices=argslices, leading_comment=( "# " + context.data.slice_comments[(0, 0)] if (0, 0) in context.data.slice_comments else None ), ) report(printer.getvalue()) return test(*args, **kwargs) # Run the test function once, via the executor hook. # In most cases this will delegate straight to `run(data)`. 
> result = self.test_runner(data, run) /usr/lib/python3.12/site-packages/hypothesis/core.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ data = ConjectureData(INTERESTING, 18 bytes, frozen) function = .run at 0xffffff99c827a0> def default_new_style_executor(data, function): > return function(data) /usr/lib/python3.12/site-packages/hypothesis/executors.py:47: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ data = ConjectureData(INTERESTING, 18 bytes, frozen) def run(data): # Set up dynamic context needed by a single test run. with local_settings(self.settings): with deterministic_PRNG(): with BuildContext(data, is_final=is_final) as context: if self.stuff.selfy is not None: data.hypothesis_runner = self.stuff.selfy # Generate all arguments to the test function. args = self.stuff.args kwargs = dict(self.stuff.kwargs) if example_kwargs is None: a, kw, argslices = context.prep_args_kwargs_from_strategies( (), self.stuff.given_kwargs ) assert not a, "strategies all moved to kwargs by now" else: kw = example_kwargs argslices = {} kwargs.update(kw) if expected_failure is not None: nonlocal text_repr text_repr = repr_call(test, args, kwargs) if print_example or current_verbosity() >= Verbosity.verbose: printer = RepresentationPrinter(context=context) if print_example: printer.text("Falsifying example:") else: printer.text("Trying example:") if self.print_given_args: printer.text(" ") printer.repr_call( test.__name__, args, kwargs, force_split=True, arg_slices=argslices, leading_comment=( "# " + context.data.slice_comments[(0, 0)] if (0, 0) in context.data.slice_comments else None ), ) report(printer.getvalue()) > return test(*args, **kwargs) /usr/lib/python3.12/site-packages/hypothesis/core.py:785: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ bi = NamedBidict() @given(st.MUTABLE_BIDICTS) > def test_cleared_bidicts_have_no_items(bi: MBi) -> None: tests/property_tests/test_properties.py:243: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = (NamedBidict(),), kwargs = {}, initial_draws = 1 start = 8065832.620073373, result = None, finish = 8065832.900930308 internal_draw_time = 0, runtime = datetime.timedelta(microseconds=280857) current_deadline = datetime.timedelta(microseconds=250000) @proxies(self.test) def test(*args, **kwargs): self.__test_runtime = None initial_draws = len(data.draw_times) start = time.perf_counter() result = self.test(*args, **kwargs) finish = time.perf_counter() internal_draw_time = sum(data.draw_times[initial_draws:]) runtime = datetime.timedelta( seconds=finish - start - internal_draw_time ) self.__test_runtime = runtime current_deadline = self.settings.deadline if not is_final: current_deadline = (current_deadline // 4) * 5 if runtime >= current_deadline: > raise DeadlineExceeded(runtime, self.settings.deadline) E hypothesis.errors.DeadlineExceeded: Test took 280.86ms, which exceeds the deadline of 200.00ms /usr/lib/python3.12/site-packages/hypothesis/core.py:737: DeadlineExceeded The above exception was the direct cause of the following exception: @given(st.MUTABLE_BIDICTS) > def test_cleared_bidicts_have_no_items(bi: MBi) -> None: tests/property_tests/test_properties.py:243: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = data = ConjectureData(VALID, 18 bytes, frozen), print_example = True is_final = True expected_failure = (DeadlineExceeded('Test took 280.86ms, which exceeds the deadline 
of 200.00ms'), 'args = (NamedBidict(),), kwargs = {}...hich exceeds the deadline of 200.00ms\n\n/usr/lib/python3.12/site-packages/hypothesis/core.py:737: DeadlineExceeded\n') example_kwargs = None def execute_once( self, data, print_example=False, is_final=False, expected_failure=None, example_kwargs=None, ): """Run the test function once, using ``data`` as input. If the test raises an exception, it will propagate through to the caller of this method. Depending on its type, this could represent an ordinary test failure, or a fatal error, or a control exception. If this method returns normally, the test might have passed, or it might have placed ``data`` in an unsuccessful state and then swallowed the corresponding control exception. """ self.ever_executed = True data.is_find = self.is_find text_repr = None if self.settings.deadline is None: test = self.test else: @proxies(self.test) def test(*args, **kwargs): self.__test_runtime = None initial_draws = len(data.draw_times) start = time.perf_counter() result = self.test(*args, **kwargs) finish = time.perf_counter() internal_draw_time = sum(data.draw_times[initial_draws:]) runtime = datetime.timedelta( seconds=finish - start - internal_draw_time ) self.__test_runtime = runtime current_deadline = self.settings.deadline if not is_final: current_deadline = (current_deadline // 4) * 5 if runtime >= current_deadline: raise DeadlineExceeded(runtime, self.settings.deadline) return result def run(data): # Set up dynamic context needed by a single test run. with local_settings(self.settings): with deterministic_PRNG(): with BuildContext(data, is_final=is_final) as context: if self.stuff.selfy is not None: data.hypothesis_runner = self.stuff.selfy # Generate all arguments to the test function. args = self.stuff.args kwargs = dict(self.stuff.kwargs) if example_kwargs is None: a, kw, argslices = context.prep_args_kwargs_from_strategies( (), self.stuff.given_kwargs ) assert not a, "strategies all moved to kwargs by now" else: kw = example_kwargs argslices = {} kwargs.update(kw) if expected_failure is not None: nonlocal text_repr text_repr = repr_call(test, args, kwargs) if print_example or current_verbosity() >= Verbosity.verbose: printer = RepresentationPrinter(context=context) if print_example: printer.text("Falsifying example:") else: printer.text("Trying example:") if self.print_given_args: printer.text(" ") printer.repr_call( test.__name__, args, kwargs, force_split=True, arg_slices=argslices, leading_comment=( "# " + context.data.slice_comments[(0, 0)] if (0, 0) in context.data.slice_comments else None ), ) report(printer.getvalue()) return test(*args, **kwargs) # Run the test function once, via the executor hook. # In most cases this will delegate straight to `run(data)`. result = self.test_runner(data, run) # If a failure was expected, it should have been raised already, so # instead raise an appropriate diagnostic error. if expected_failure is not None: exception, traceback = expected_failure if ( isinstance(exception, DeadlineExceeded) and self.__test_runtime is not None ): report( "Unreliable test timings! On an initial run, this " "test took %.2fms, which exceeded the deadline of " "%.2fms, but on a subsequent run it took %.2f ms, " "which did not. If you expect this sort of " "variability in your test timings, consider turning " "deadlines off for this test by setting deadline=None." 
% ( exception.runtime.total_seconds() * 1000, self.settings.deadline.total_seconds() * 1000, self.__test_runtime.total_seconds() * 1000, ) ) else: report("Failed to reproduce exception. Expected: \n" + traceback) > raise Flaky( f"Hypothesis {text_repr} produces unreliable results: " "Falsified on the first call but did not on a subsequent one" ) from exception E hypothesis.errors.Flaky: Hypothesis test_cleared_bidicts_have_no_items(bi=NamedBidict({None: None, True: False})) produces unreliable results: Falsified on the first call but did not on a subsequent one E Falsifying example: test_cleared_bidicts_have_no_items( E bi=(lambda i: i[0](i[1]))( E (tests.property_tests._types.NamedBidict, PrettyIter( E [(None, None), (True, False)], E )), E ), E ) E Unreliable test timings! On an initial run, this test took 280.86ms, which exceeded the deadline of 200.00ms, but on a subsequent run it took 0.17 ms, which did not. If you expect this sort of variability in your test timings, consider turning deadlines off for this test by setting deadline=None. /usr/lib/python3.12/site-packages/hypothesis/core.py:814: Flaky _________________________________ test_pickle __________________________________ [gw6] linux -- Python 3.12.0 /usr/bin/python3 self = data = ConjectureData(INTERESTING, 3 bytes, frozen) def _execute_once_for_engine(self, data): """Wrapper around ``execute_once`` that intercepts test failure exceptions and single-test control exceptions, and turns them into appropriate method calls to `data` instead. This allows the engine to assume that any exception other than ``StopTest`` must be a fatal error, and should stop the entire engine. """ try: trace = frozenset() if ( self.failed_normally and not self.failed_due_to_deadline and Phase.shrink in self.settings.phases and Phase.explain in self.settings.phases and sys.gettrace() is None and not PYPY ): # pragma: no cover # This is in fact covered by our *non-coverage* tests, but due to the # settrace() contention *not* by our coverage tests. Ah well. tracer = Tracer() try: sys.settrace(tracer.trace) result = self.execute_once(data) if data.status == Status.VALID: self.explain_traces[None].add(frozenset(tracer.branches)) finally: sys.settrace(None) trace = frozenset(tracer.branches) else: > result = self.execute_once(data) /usr/lib/python3.12/site-packages/hypothesis/core.py:850: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = data = ConjectureData(INTERESTING, 3 bytes, frozen), print_example = False is_final = False, expected_failure = None, example_kwargs = None def execute_once( self, data, print_example=False, is_final=False, expected_failure=None, example_kwargs=None, ): """Run the test function once, using ``data`` as input. If the test raises an exception, it will propagate through to the caller of this method. Depending on its type, this could represent an ordinary test failure, or a fatal error, or a control exception. If this method returns normally, the test might have passed, or it might have placed ``data`` in an unsuccessful state and then swallowed the corresponding control exception. 
""" self.ever_executed = True data.is_find = self.is_find text_repr = None if self.settings.deadline is None: test = self.test else: @proxies(self.test) def test(*args, **kwargs): self.__test_runtime = None initial_draws = len(data.draw_times) start = time.perf_counter() result = self.test(*args, **kwargs) finish = time.perf_counter() internal_draw_time = sum(data.draw_times[initial_draws:]) runtime = datetime.timedelta( seconds=finish - start - internal_draw_time ) self.__test_runtime = runtime current_deadline = self.settings.deadline if not is_final: current_deadline = (current_deadline // 4) * 5 if runtime >= current_deadline: raise DeadlineExceeded(runtime, self.settings.deadline) return result def run(data): # Set up dynamic context needed by a single test run. with local_settings(self.settings): with deterministic_PRNG(): with BuildContext(data, is_final=is_final) as context: if self.stuff.selfy is not None: data.hypothesis_runner = self.stuff.selfy # Generate all arguments to the test function. args = self.stuff.args kwargs = dict(self.stuff.kwargs) if example_kwargs is None: a, kw, argslices = context.prep_args_kwargs_from_strategies( (), self.stuff.given_kwargs ) assert not a, "strategies all moved to kwargs by now" else: kw = example_kwargs argslices = {} kwargs.update(kw) if expected_failure is not None: nonlocal text_repr text_repr = repr_call(test, args, kwargs) if print_example or current_verbosity() >= Verbosity.verbose: printer = RepresentationPrinter(context=context) if print_example: printer.text("Falsifying example:") else: printer.text("Trying example:") if self.print_given_args: printer.text(" ") printer.repr_call( test.__name__, args, kwargs, force_split=True, arg_slices=argslices, leading_comment=( "# " + context.data.slice_comments[(0, 0)] if (0, 0) in context.data.slice_comments else None ), ) report(printer.getvalue()) return test(*args, **kwargs) # Run the test function once, via the executor hook. # In most cases this will delegate straight to `run(data)`. > result = self.test_runner(data, run) /usr/lib/python3.12/site-packages/hypothesis/core.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ data = ConjectureData(INTERESTING, 3 bytes, frozen) function = .run at 0xffffffb79204a0> def default_new_style_executor(data, function): > return function(data) /usr/lib/python3.12/site-packages/hypothesis/executors.py:47: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ data = ConjectureData(INTERESTING, 3 bytes, frozen) def run(data): # Set up dynamic context needed by a single test run. with local_settings(self.settings): with deterministic_PRNG(): with BuildContext(data, is_final=is_final) as context: if self.stuff.selfy is not None: data.hypothesis_runner = self.stuff.selfy # Generate all arguments to the test function. 
args = self.stuff.args kwargs = dict(self.stuff.kwargs) if example_kwargs is None: a, kw, argslices = context.prep_args_kwargs_from_strategies( (), self.stuff.given_kwargs ) assert not a, "strategies all moved to kwargs by now" else: kw = example_kwargs argslices = {} kwargs.update(kw) if expected_failure is not None: nonlocal text_repr text_repr = repr_call(test, args, kwargs) if print_example or current_verbosity() >= Verbosity.verbose: printer = RepresentationPrinter(context=context) if print_example: printer.text("Falsifying example:") else: printer.text("Trying example:") if self.print_given_args: printer.text(" ") printer.repr_call( test.__name__, args, kwargs, force_split=True, arg_slices=argslices, leading_comment=( "# " + context.data.slice_comments[(0, 0)] if (0, 0) in context.data.slice_comments else None ), ) report(printer.getvalue()) > return test(*args, **kwargs) /usr/lib/python3.12/site-packages/hypothesis/core.py:785: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ bi = frozenbidict() @given(st.BIDICTS) > @example(BIDICT_TYPE_WHOSE_MODULE_HAS_REF_TO_INV_CLS({1: 'one'}).inverse) tests/property_tests/test_properties.py:473: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = (frozenbidict(),), kwargs = {}, initial_draws = 1 start = 8065847.831160733, result = None, finish = 8065849.10746511 internal_draw_time = 0 runtime = datetime.timedelta(seconds=1, microseconds=276304) current_deadline = datetime.timedelta(microseconds=250000) @proxies(self.test) def test(*args, **kwargs): self.__test_runtime = None initial_draws = len(data.draw_times) start = time.perf_counter() result = self.test(*args, **kwargs) finish = time.perf_counter() internal_draw_time = sum(data.draw_times[initial_draws:]) runtime = datetime.timedelta( seconds=finish - start - internal_draw_time ) self.__test_runtime = runtime current_deadline = self.settings.deadline if not is_final: current_deadline = (current_deadline // 4) * 5 if runtime >= current_deadline: > raise DeadlineExceeded(runtime, self.settings.deadline) E hypothesis.errors.DeadlineExceeded: Test took 1276.30ms, which exceeds the deadline of 200.00ms /usr/lib/python3.12/site-packages/hypothesis/core.py:737: DeadlineExceeded The above exception was the direct cause of the following exception: @given(st.BIDICTS) > @example(BIDICT_TYPE_WHOSE_MODULE_HAS_REF_TO_INV_CLS({1: 'one'}).inverse) tests/property_tests/test_properties.py:473: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = data = ConjectureData(VALID, 3 bytes, frozen), print_example = True is_final = True expected_failure = (DeadlineExceeded('Test took 1276.30ms, which exceeds the deadline of 200.00ms'), 'args = (frozenbidict(),), kwargs = ...hich exceeds the deadline of 200.00ms\n\n/usr/lib/python3.12/site-packages/hypothesis/core.py:737: DeadlineExceeded\n') example_kwargs = None def execute_once( self, data, print_example=False, is_final=False, expected_failure=None, example_kwargs=None, ): """Run the test function once, using ``data`` as input. If the test raises an exception, it will propagate through to the caller of this method. Depending on its type, this could represent an ordinary test failure, or a fatal error, or a control exception. If this method returns normally, the test might have passed, or it might have placed ``data`` in an unsuccessful state and then swallowed the corresponding control exception. 
""" self.ever_executed = True data.is_find = self.is_find text_repr = None if self.settings.deadline is None: test = self.test else: @proxies(self.test) def test(*args, **kwargs): self.__test_runtime = None initial_draws = len(data.draw_times) start = time.perf_counter() result = self.test(*args, **kwargs) finish = time.perf_counter() internal_draw_time = sum(data.draw_times[initial_draws:]) runtime = datetime.timedelta( seconds=finish - start - internal_draw_time ) self.__test_runtime = runtime current_deadline = self.settings.deadline if not is_final: current_deadline = (current_deadline // 4) * 5 if runtime >= current_deadline: raise DeadlineExceeded(runtime, self.settings.deadline) return result def run(data): # Set up dynamic context needed by a single test run. with local_settings(self.settings): with deterministic_PRNG(): with BuildContext(data, is_final=is_final) as context: if self.stuff.selfy is not None: data.hypothesis_runner = self.stuff.selfy # Generate all arguments to the test function. args = self.stuff.args kwargs = dict(self.stuff.kwargs) if example_kwargs is None: a, kw, argslices = context.prep_args_kwargs_from_strategies( (), self.stuff.given_kwargs ) assert not a, "strategies all moved to kwargs by now" else: kw = example_kwargs argslices = {} kwargs.update(kw) if expected_failure is not None: nonlocal text_repr text_repr = repr_call(test, args, kwargs) if print_example or current_verbosity() >= Verbosity.verbose: printer = RepresentationPrinter(context=context) if print_example: printer.text("Falsifying example:") else: printer.text("Trying example:") if self.print_given_args: printer.text(" ") printer.repr_call( test.__name__, args, kwargs, force_split=True, arg_slices=argslices, leading_comment=( "# " + context.data.slice_comments[(0, 0)] if (0, 0) in context.data.slice_comments else None ), ) report(printer.getvalue()) return test(*args, **kwargs) # Run the test function once, via the executor hook. # In most cases this will delegate straight to `run(data)`. result = self.test_runner(data, run) # If a failure was expected, it should have been raised already, so # instead raise an appropriate diagnostic error. if expected_failure is not None: exception, traceback = expected_failure if ( isinstance(exception, DeadlineExceeded) and self.__test_runtime is not None ): report( "Unreliable test timings! On an initial run, this " "test took %.2fms, which exceeded the deadline of " "%.2fms, but on a subsequent run it took %.2f ms, " "which did not. If you expect this sort of " "variability in your test timings, consider turning " "deadlines off for this test by setting deadline=None." % ( exception.runtime.total_seconds() * 1000, self.settings.deadline.total_seconds() * 1000, self.__test_runtime.total_seconds() * 1000, ) ) else: report("Failed to reproduce exception. Expected: \n" + traceback) > raise Flaky( f"Hypothesis {text_repr} produces unreliable results: " "Falsified on the first call but did not on a subsequent one" ) from exception E hypothesis.errors.Flaky: Hypothesis test_pickle(bi=frozenbidict()) produces unreliable results: Falsified on the first call but did not on a subsequent one E Falsifying example: test_pickle( E bi=operator.attrgetter('inverse')( E (lambda i: i[0](i[1]))((bidict.frozenbidict, PrettyIter([]))), E ), E ) E Unreliable test timings! On an initial run, this test took 1276.30ms, which exceeded the deadline of 200.00ms, but on a subsequent run it took 1.77 ms, which did not. 
If you expect this sort of variability in your test timings, consider turning deadlines off for this test by setting deadline=None.

/usr/lib/python3.12/site-packages/hypothesis/core.py:814: Flaky
____________________ test_putall_same_as_put_for_each_item _____________________
[gw4] linux -- Python 3.12.0 /usr/bin/python3
>   ???
tests/property_tests/test_properties.py:306:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/site-packages/hypothesis/core.py:1043: in _raise_to_user
    raise the_error_hypothesis_found
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
args = (OrderedBidict([(1, 1), (2, 2)]), [(1, 2), (1, 1)], OnDup(key=OD.RAISE, val=OD.RAISE, kv=OD.DROP_OLD))
kwargs = {}, initial_draws = 0, start = 8065860.615092923, result = None
finish = 8065861.157213687, internal_draw_time = 0
runtime = datetime.timedelta(microseconds=542121)
current_deadline = datetime.timedelta(microseconds=200000)

    @proxies(self.test)
    def test(*args, **kwargs):
        self.__test_runtime = None
        initial_draws = len(data.draw_times)
        start = time.perf_counter()
        result = self.test(*args, **kwargs)
        finish = time.perf_counter()
        internal_draw_time = sum(data.draw_times[initial_draws:])
        runtime = datetime.timedelta(
            seconds=finish - start - internal_draw_time
        )
        self.__test_runtime = runtime
        current_deadline = self.settings.deadline
        if not is_final:
            current_deadline = (current_deadline // 4) * 5
        if runtime >= current_deadline:
>           raise DeadlineExceeded(runtime, self.settings.deadline)
E           hypothesis.errors.DeadlineExceeded: Test took 542.12ms, which exceeds the deadline of 200.00ms
E           Falsifying explicit example: test_putall_same_as_put_for_each_item(
E               bi=OrderedBidict([(1, 1), (2, 2)]),
E               items=[(1, 2), (1, 1)],
E               on_dup=OnDup(key=OD.RAISE, val=OD.RAISE, kv=OD.DROP_OLD),
E           )

/usr/lib/python3.12/site-packages/hypothesis/core.py:737: DeadlineExceeded
_____________________________ test_inverted_bidict _____________________________
[gw6] linux -- Python 3.12.0 /usr/bin/python3

    @given(st.BI_AND_MAP_FROM_SAME_ND_ITEMS)
>   def test_inverted_bidict(bi_and_mapping: t.Any) -> None:
E   hypothesis.errors.FailedHealthCheck: Data generation is extremely slow: Only produced 3 valid examples in 1.00 seconds (0 invalid ones and 1 exceeded maximum size). Try decreasing size of the data you're generating (with e.g. max_size or max_leaves parameters).
E   See https://hypothesis.readthedocs.io/en/latest/healthchecks.html for more information about this. If you want to disable just this health check, add HealthCheck.too_slow to the suppress_health_check settings for this test.

tests/property_tests/test_properties.py:589: FailedHealthCheck
---------------------------------- Hypothesis ----------------------------------
You can add @seed(190787274156778112295342005860076524660) to this test or run pytest with --hypothesis-seed=190787274156778112295342005860076524660 to reproduce this failure.
_________________________ test_frozenbidicts_hashable __________________________
[gw3] linux -- Python 3.12.0 /usr/bin/python3

    @given(st.FROZEN_BIDICTS)
>   def test_frozenbidicts_hashable(bi: Bi) -> None:
E   hypothesis.errors.FailedHealthCheck: Data generation is extremely slow: Only produced 9 valid examples in 1.01 seconds (0 invalid ones and 3 exceeded maximum size). Try decreasing size of the data you're generating (with e.g. max_size or max_leaves parameters).
E   See https://hypothesis.readthedocs.io/en/latest/healthchecks.html for more information about this.
If you want to disable just this health check, add HealthCheck.too_slow to the suppress_health_check settings for this test. tests/property_tests/test_properties.py:354: FailedHealthCheck ---------------------------------- Hypothesis ---------------------------------- You can add @seed(113832198810611545482846198235623663668) to this test or run pytest with --hypothesis-seed=113832198810611545482846198235623663668 to reproduce this failure. __________________________________ test_views __________________________________ [gw0] linux -- Python 3.12.0 /usr/bin/python3 @given(st.BIDICTS, st.DATA) > def test_views(bi: t.Any, data: t.Any) -> None: E hypothesis.errors.FailedHealthCheck: Data generation is extremely slow: Only produced 3 valid examples in 1.04 seconds (0 invalid ones and 2 exceeded maximum size). Try decreasing size of the data you're generating (with e.g. max_size or max_leaves parameters). E See https://hypothesis.readthedocs.io/en/latest/healthchecks.html for more information about this. If you want to disable just this health check, add HealthCheck.too_slow to the suppress_health_check settings for this test. tests/property_tests/test_properties.py:603: FailedHealthCheck ---------------------------------- Hypothesis ---------------------------------- You can add @seed(168328981997675239523557895253007869989) to this test or run pytest with --hypothesis-seed=168328981997675239523557895253007869989 to reproduce this failure. ________________________________ test_iteritems ________________________________ [gw5] linux -- Python 3.12.0 /usr/bin/python3 self = data = ConjectureData(INTERESTING, 27 bytes, frozen) def _execute_once_for_engine(self, data): """Wrapper around ``execute_once`` that intercepts test failure exceptions and single-test control exceptions, and turns them into appropriate method calls to `data` instead. This allows the engine to assume that any exception other than ``StopTest`` must be a fatal error, and should stop the entire engine. """ try: trace = frozenset() if ( self.failed_normally and not self.failed_due_to_deadline and Phase.shrink in self.settings.phases and Phase.explain in self.settings.phases and sys.gettrace() is None and not PYPY ): # pragma: no cover # This is in fact covered by our *non-coverage* tests, but due to the # settrace() contention *not* by our coverage tests. Ah well. tracer = Tracer() try: sys.settrace(tracer.trace) result = self.execute_once(data) if data.status == Status.VALID: self.explain_traces[None].add(frozenset(tracer.branches)) finally: sys.settrace(None) trace = frozenset(tracer.branches) else: > result = self.execute_once(data) /usr/lib/python3.12/site-packages/hypothesis/core.py:850: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = data = ConjectureData(INTERESTING, 27 bytes, frozen), print_example = False is_final = False, expected_failure = None, example_kwargs = None def execute_once( self, data, print_example=False, is_final=False, expected_failure=None, example_kwargs=None, ): """Run the test function once, using ``data`` as input. If the test raises an exception, it will propagate through to the caller of this method. Depending on its type, this could represent an ordinary test failure, or a fatal error, or a control exception. If this method returns normally, the test might have passed, or it might have placed ``data`` in an unsuccessful state and then swallowed the corresponding control exception. 
""" self.ever_executed = True data.is_find = self.is_find text_repr = None if self.settings.deadline is None: test = self.test else: @proxies(self.test) def test(*args, **kwargs): self.__test_runtime = None initial_draws = len(data.draw_times) start = time.perf_counter() result = self.test(*args, **kwargs) finish = time.perf_counter() internal_draw_time = sum(data.draw_times[initial_draws:]) runtime = datetime.timedelta( seconds=finish - start - internal_draw_time ) self.__test_runtime = runtime current_deadline = self.settings.deadline if not is_final: current_deadline = (current_deadline // 4) * 5 if runtime >= current_deadline: raise DeadlineExceeded(runtime, self.settings.deadline) return result def run(data): # Set up dynamic context needed by a single test run. with local_settings(self.settings): with deterministic_PRNG(): with BuildContext(data, is_final=is_final) as context: if self.stuff.selfy is not None: data.hypothesis_runner = self.stuff.selfy # Generate all arguments to the test function. args = self.stuff.args kwargs = dict(self.stuff.kwargs) if example_kwargs is None: a, kw, argslices = context.prep_args_kwargs_from_strategies( (), self.stuff.given_kwargs ) assert not a, "strategies all moved to kwargs by now" else: kw = example_kwargs argslices = {} kwargs.update(kw) if expected_failure is not None: nonlocal text_repr text_repr = repr_call(test, args, kwargs) if print_example or current_verbosity() >= Verbosity.verbose: printer = RepresentationPrinter(context=context) if print_example: printer.text("Falsifying example:") else: printer.text("Trying example:") if self.print_given_args: printer.text(" ") printer.repr_call( test.__name__, args, kwargs, force_split=True, arg_slices=argslices, leading_comment=( "# " + context.data.slice_comments[(0, 0)] if (0, 0) in context.data.slice_comments else None ), ) report(printer.getvalue()) return test(*args, **kwargs) # Run the test function once, via the executor hook. # In most cases this will delegate straight to `run(data)`. > result = self.test_runner(data, run) /usr/lib/python3.12/site-packages/hypothesis/core.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ data = ConjectureData(INTERESTING, 27 bytes, frozen) function = .run at 0xffffff9b3ce520> def default_new_style_executor(data, function): > return function(data) /usr/lib/python3.12/site-packages/hypothesis/executors.py:47: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ data = ConjectureData(INTERESTING, 27 bytes, frozen) def run(data): # Set up dynamic context needed by a single test run. with local_settings(self.settings): with deterministic_PRNG(): with BuildContext(data, is_final=is_final) as context: if self.stuff.selfy is not None: data.hypothesis_runner = self.stuff.selfy # Generate all arguments to the test function. 
args = self.stuff.args kwargs = dict(self.stuff.kwargs) if example_kwargs is None: a, kw, argslices = context.prep_args_kwargs_from_strategies( (), self.stuff.given_kwargs ) assert not a, "strategies all moved to kwargs by now" else: kw = example_kwargs argslices = {} kwargs.update(kw) if expected_failure is not None: nonlocal text_repr text_repr = repr_call(test, args, kwargs) if print_example or current_verbosity() >= Verbosity.verbose: printer = RepresentationPrinter(context=context) if print_example: printer.text("Falsifying example:") else: printer.text("Trying example:") if self.print_given_args: printer.text(" ") printer.repr_call( test.__name__, args, kwargs, force_split=True, arg_slices=argslices, leading_comment=( "# " + context.data.slice_comments[(0, 0)] if (0, 0) in context.data.slice_comments else None ), ) report(printer.getvalue()) > return test(*args, **kwargs) /usr/lib/python3.12/site-packages/hypothesis/core.py:785: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ arg0 = iter([(True, None), (None, False), (None, True), (True, None), (False, None)]) kw = {} @given(st.I_PAIRS, st.DICTS_KW_PAIRS) > def test_iteritems(arg0: t.Any, kw: t.Any) -> None: tests/property_tests/test_properties.py:565: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = (iter([(True, None), (None, False), (None, True), (True, None), (False, None)]), {}) kwargs = {}, initial_draws = 2, start = 8065890.214482345, result = None finish = 8065890.502438619, internal_draw_time = 0 runtime = datetime.timedelta(microseconds=287956) current_deadline = datetime.timedelta(microseconds=250000) @proxies(self.test) def test(*args, **kwargs): self.__test_runtime = None initial_draws = len(data.draw_times) start = time.perf_counter() result = self.test(*args, **kwargs) finish = time.perf_counter() internal_draw_time = sum(data.draw_times[initial_draws:]) runtime = datetime.timedelta( seconds=finish - start - internal_draw_time ) self.__test_runtime = runtime current_deadline = self.settings.deadline if not is_final: current_deadline = (current_deadline // 4) * 5 if runtime >= current_deadline: > raise DeadlineExceeded(runtime, self.settings.deadline) E hypothesis.errors.DeadlineExceeded: Test took 287.96ms, which exceeds the deadline of 200.00ms /usr/lib/python3.12/site-packages/hypothesis/core.py:737: DeadlineExceeded The above exception was the direct cause of the following exception: @given(st.I_PAIRS, st.DICTS_KW_PAIRS) > def test_iteritems(arg0: t.Any, kw: t.Any) -> None: tests/property_tests/test_properties.py:565: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = data = ConjectureData(VALID, 27 bytes, frozen), print_example = True is_final = True expected_failure = (DeadlineExceeded('Test took 287.96ms, which exceeds the deadline of 200.00ms'), 'args = (iter([(True, None), (None, F...hich exceeds the deadline of 200.00ms\n\n/usr/lib/python3.12/site-packages/hypothesis/core.py:737: DeadlineExceeded\n') example_kwargs = None def execute_once( self, data, print_example=False, is_final=False, expected_failure=None, example_kwargs=None, ): """Run the test function once, using ``data`` as input. If the test raises an exception, it will propagate through to the caller of this method. Depending on its type, this could represent an ordinary test failure, or a fatal error, or a control exception. 
If this method returns normally, the test might have passed, or it might have placed ``data`` in an unsuccessful state and then swallowed the corresponding control exception. """ self.ever_executed = True data.is_find = self.is_find text_repr = None if self.settings.deadline is None: test = self.test else: @proxies(self.test) def test(*args, **kwargs): self.__test_runtime = None initial_draws = len(data.draw_times) start = time.perf_counter() result = self.test(*args, **kwargs) finish = time.perf_counter() internal_draw_time = sum(data.draw_times[initial_draws:]) runtime = datetime.timedelta( seconds=finish - start - internal_draw_time ) self.__test_runtime = runtime current_deadline = self.settings.deadline if not is_final: current_deadline = (current_deadline // 4) * 5 if runtime >= current_deadline: raise DeadlineExceeded(runtime, self.settings.deadline) return result def run(data): # Set up dynamic context needed by a single test run. with local_settings(self.settings): with deterministic_PRNG(): with BuildContext(data, is_final=is_final) as context: if self.stuff.selfy is not None: data.hypothesis_runner = self.stuff.selfy # Generate all arguments to the test function. args = self.stuff.args kwargs = dict(self.stuff.kwargs) if example_kwargs is None: a, kw, argslices = context.prep_args_kwargs_from_strategies( (), self.stuff.given_kwargs ) assert not a, "strategies all moved to kwargs by now" else: kw = example_kwargs argslices = {} kwargs.update(kw) if expected_failure is not None: nonlocal text_repr text_repr = repr_call(test, args, kwargs) if print_example or current_verbosity() >= Verbosity.verbose: printer = RepresentationPrinter(context=context) if print_example: printer.text("Falsifying example:") else: printer.text("Trying example:") if self.print_given_args: printer.text(" ") printer.repr_call( test.__name__, args, kwargs, force_split=True, arg_slices=argslices, leading_comment=( "# " + context.data.slice_comments[(0, 0)] if (0, 0) in context.data.slice_comments else None ), ) report(printer.getvalue()) return test(*args, **kwargs) # Run the test function once, via the executor hook. # In most cases this will delegate straight to `run(data)`. result = self.test_runner(data, run) # If a failure was expected, it should have been raised already, so # instead raise an appropriate diagnostic error. if expected_failure is not None: exception, traceback = expected_failure if ( isinstance(exception, DeadlineExceeded) and self.__test_runtime is not None ): report( "Unreliable test timings! On an initial run, this " "test took %.2fms, which exceeded the deadline of " "%.2fms, but on a subsequent run it took %.2f ms, " "which did not. If you expect this sort of " "variability in your test timings, consider turning " "deadlines off for this test by setting deadline=None." % ( exception.runtime.total_seconds() * 1000, self.settings.deadline.total_seconds() * 1000, self.__test_runtime.total_seconds() * 1000, ) ) else: report("Failed to reproduce exception. 
Expected: \n" + traceback)
>       raise Flaky(
            f"Hypothesis {text_repr} produces unreliable results: "
            "Falsified on the first call but did not on a subsequent one"
        ) from exception
E       hypothesis.errors.Flaky: Hypothesis test_iteritems(arg0=iter([(True, None), (None, False), (None, True), (True, None), (False, None)]), kw={}) produces unreliable results: Falsified on the first call but did not on a subsequent one
E       Falsifying example: test_iteritems(
E           arg0=PrettyIter(
E               [(True, None),
E                (None, False),
E                (None, True),
E                (True, None),
E                (False, None)],
E           ),
E           kw={},
E       )
E       Unreliable test timings! On an initial run, this test took 287.96ms, which exceeded the deadline of 200.00ms, but on a subsequent run it took 0.64 ms, which did not. If you expect this sort of variability in your test timings, consider turning deadlines off for this test by setting deadline=None.

/usr/lib/python3.12/site-packages/hypothesis/core.py:814: Flaky
=========================== short test summary info ============================
FAILED tests/property_tests/test_properties.py::test_unequal_order_sensitive_same_items_different_order
FAILED tests/property_tests/test_properties.py::test_equal_hashables_have_same_hash
FAILED tests/property_tests/test_properties.py::test_eq_correctly_defers_to_eq_of_non_mapping
FAILED tests/property_tests/test_properties.py::test_unequal_to_non_mapping
FAILED tests/property_tests/test_properties.py::test_equals_order_sensitive_same_items
FAILED tests/property_tests/test_properties.py::test_consistency_after_method_call
FAILED tests/property_tests/test_properties.py::test_setitem_with_dup_val_raises
FAILED tests/property_tests/test_properties.py::test_unequal_to_mapping_with_different_items
FAILED tests/property_tests/test_properties.py::test_bidict_iter - hypothesis...
FAILED tests/property_tests/test_properties.py::test_put_with_dup_key_raises
FAILED tests/property_tests/test_properties.py::test_merge_operators - hypoth...
FAILED tests/property_tests/test_properties.py::test_namedbidict_raises_on_invalid_name
FAILED tests/property_tests/test_properties.py::test_bidict_reversed - hypoth...
FAILED tests/property_tests/test_properties.py::test_equal_to_mapping_with_same_items
FAILED tests/property_tests/test_properties.py::test_setitem_with_dup_key_val_raises
FAILED tests/property_tests/test_properties.py::test_namedbidict_raises_on_invalid_base_type
FAILED tests/property_tests/test_properties.py::test_inverse_readonly - hypot...
FAILED tests/property_tests/test_properties.py::test_namedbidict - hypothesis...
FAILED tests/property_tests/test_properties.py::test_bijectivity - hypothesis...
FAILED tests/property_tests/test_properties.py::test_orderedbidict_nodes_consistent
FAILED tests/property_tests/test_properties.py::test_namedbidict_raises_on_same_keyname_as_valname
FAILED tests/property_tests/test_properties.py::test_deepcopy - hypothesis.er...
FAILED tests/property_tests/test_properties.py::test_cleared_bidicts_have_no_items
FAILED tests/property_tests/test_properties.py::test_pickle - hypothesis.erro...
FAILED tests/property_tests/test_properties.py::test_putall_same_as_put_for_each_item
FAILED tests/property_tests/test_properties.py::test_inverted_bidict - hypoth...
FAILED tests/property_tests/test_properties.py::test_frozenbidicts_hashable
FAILED tests/property_tests/test_properties.py::test_views - hypothesis.error...
FAILED tests/property_tests/test_properties.py::test_iteritems - hypothesis.e...
================== 29 failed, 53 passed in 273.00s (0:04:33) ===================
Running Sphinx v7.2.6
making output directory... done
loading intersphinx inventory from https://docs.python.org/3/objects.inv...
WARNING: failed to reach any of the inventories with the following issues:
intersphinx inventory 'https://docs.python.org/3/objects.inv' not fetchable due to : HTTPSConnectionPool(host='docs.python.org', port=443): Max retries exceeded with url: /3/objects.inv (Caused by NewConnectionError(': Failed to establish a new connection: [Errno -3] Temporary failure in name resolution'))
building [mo]: targets for 0 po files that are out of date
writing output...
building [doctest]: targets for 14 source files that are out of date
updating environment: [new config] 14 added, 0 changed, 0 removed
reading sources... [ 7%] README
reading sources... [ 14%] addendum
reading sources... [ 21%] api
reading sources... [ 29%] basic-usage
reading sources... [ 36%] changelog
reading sources... [ 43%] code-of-conduct
reading sources... [ 50%] contributors-guide
reading sources... [ 57%] extending
reading sources... [ 64%] index
reading sources... [ 71%] intro
reading sources... [ 79%] learning-from-bidict
reading sources... [ 86%] other-bidict-types
reading sources... [ 93%] other-functionality
reading sources... [100%] thanks
looking for now-outdated files... none found
pickling environment... done
checking consistency... done
running tests...

Document: extending
-------------------
1 items passed all tests:
  53 tests in default
53 tests in 1 items.
53 passed and 0 failed.
Test passed.

Document: learning-from-bidict
------------------------------
1 items passed all tests:
  24 tests in default
24 tests in 1 items.
24 passed and 0 failed.
Test passed.

Document: basic-usage
---------------------
1 items passed all tests:
  57 tests in default
57 tests in 1 items.
57 passed and 0 failed.
Test passed.

Document: other-bidict-types
----------------------------
1 items passed all tests:
  72 tests in default
72 tests in 1 items.
72 passed and 0 failed.
Test passed.

Document: addendum
------------------
1 items passed all tests:
  21 tests in default
21 tests in 1 items.
21 passed and 0 failed.
Test passed.

Document: intro
---------------
1 items passed all tests:
  20 tests in default
20 tests in 1 items.
20 passed and 0 failed.
Test passed.

Document: other-functionality
-----------------------------
1 items passed all tests:
  7 tests in default
7 tests in 1 items.
7 passed and 0 failed.
Test passed.

Doctest summary
===============
  254 tests
    0 failures in tests
    0 failures in setup code
    0 failures in cleanup code
build succeeded, 1 warning.

Testing of doctests in the sources finished, look at the results in docs/_build/doctest/output.txt.
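Note: every property-test failure shown in detail above is one of two kinds: DeadlineExceeded/Flaky (the 200.00 ms per-example deadline was exceeded once on the loaded builder and then not reproduced) or FailedHealthCheck with HealthCheck.too_slow (data generation too slow). Hypothesis's own messages name the remedies: deadline=None and suppress_health_check. A minimal sketch of one way to apply both suite-wide, assuming a conftest.py that the tests already load; the profile name "rpmbuild" is illustrative and is not part of bidict or of this log:

    # conftest.py -- illustrative sketch only
    from hypothesis import HealthCheck, settings

    # Disable the per-example deadline and the "data generation is too slow"
    # health check; both were tripped by the slow builder in the log above.
    settings.register_profile(
        "rpmbuild",
        deadline=None,
        suppress_health_check=[HealthCheck.too_slow],
    )

    # Load the profile unconditionally here; a registered profile can also be
    # selected at run time with pytest's --hypothesis-profile=rpmbuild option.
    settings.load_profile("rpmbuild")

The same two settings can instead be applied per test by stacking @settings(deadline=None, suppress_health_check=[HealthCheck.too_slow]) next to the existing @given decorators.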
RPM build errors:
error: Bad exit status from /var/tmp/rpm-tmp.pRfbYT (%check)
    %changelog not in descending chronological order
    Bad exit status from /var/tmp/rpm-tmp.pRfbYT (%check)
Child return code was: 1
EXCEPTION: [Error('Command failed: \n # bash --login -c /usr/bin/rpmbuild -ba --noprep --noclean --target noarch --nodeps /builddir/build/SPECS/python-bidict.spec\n', 1)]
Traceback (most recent call last):
  File "/usr/lib/python3.11/site-packages/mockbuild/trace_decorator.py", line 93, in trace
    result = func(*args, **kw)
             ^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/site-packages/mockbuild/util.py", line 597, in do_with_status
    raise exception.Error("Command failed: \n # %s\n%s" % (command, output), child.returncode)
mockbuild.exception.Error: Command failed:
 # bash --login -c /usr/bin/rpmbuild -ba --noprep --noclean --target noarch --nodeps /builddir/build/SPECS/python-bidict.spec
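Note: for the health-check failures, the log already prints the seed needed to replay them, either as a @seed(...) decorator or as pytest --hypothesis-seed=.... A sketch of what pinning the reported seed for test_inverted_bidict might look like; the strategies import path is assumed from the names in the traceback, and the test body is a placeholder rather than bidict's actual code:

    # Illustrative sketch only: replay the test_inverted_bidict run
    # deterministically by pinning the seed Hypothesis reported above.
    import typing as t

    from hypothesis import given, seed

    from tests.property_tests import _strategies as st  # assumed module path

    @seed(190787274156778112295342005860076524660)  # value copied from the log
    @given(st.BI_AND_MAP_FROM_SAME_ND_ITEMS)
    def test_inverted_bidict(bi_and_mapping: t.Any) -> None:
        ...  # placeholder; the real assertions live in test_properties.py

Pinning a seed only makes the failure reproducible; the %check exit status itself traces back to the deadline and too_slow settings noted above, while the "%changelog not in descending chronological order" message is a spec-file ordering issue unrelated to the test suite.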