Mock Version: 3.5
ENTER ['do_with_status'](['bash', '--login', '-c', '/usr/bin/rpmbuild -bs --noclean --target noarch --nodeps /builddir/build/SPECS/python-datanommer-models.spec'], chrootPath='/var/lib/mock/f38-build-side-42-init-devel-592210-26145/root'env={'TERM': 'vt100', 'SHELL': '/bin/bash', 'HOME': '/builddir', 'HOSTNAME': 'mock', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'PROMPT_COMMAND': 'printf "\\033]0;\\007"', 'PS1': ' \\s-\\v\\$ ', 'LANG': 'C.UTF-8'}shell=Falselogger=timeout=864000uid=996gid=135user='mockbuild'nspawn_args=[]unshare_net=TrueprintOutput=False)
Executing command: ['bash', '--login', '-c', '/usr/bin/rpmbuild -bs --noclean --target noarch --nodeps /builddir/build/SPECS/python-datanommer-models.spec'] with env {'TERM': 'vt100', 'SHELL': '/bin/bash', 'HOME': '/builddir', 'HOSTNAME': 'mock', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'PROMPT_COMMAND': 'printf "\\033]0;\\007"', 'PS1': ' \\s-\\v\\$ ', 'LANG': 'C.UTF-8'} and shell False
Building target platforms: noarch
Building for target noarch
setting SOURCE_DATE_EPOCH=1674172800
Wrote: /builddir/build/SRPMS/python-datanommer-models-1.0.4-5.fc38.src.rpm
Child return code was: 0
ENTER ['do_with_status'](['bash', '--login', '-c', '/usr/bin/rpmbuild -br --noclean --target noarch --nodeps /builddir/build/SPECS/python-datanommer-models.spec'], chrootPath='/var/lib/mock/f38-build-side-42-init-devel-592210-26145/root'env={'TERM': 'vt100', 'SHELL': '/bin/bash', 'HOME': '/builddir', 'HOSTNAME': 'mock', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'PROMPT_COMMAND': 'printf "\\033]0;\\007"', 'PS1': ' \\s-\\v\\$ ', 'LANG': 'C.UTF-8'}shell=Falselogger=timeout=864000uid=996gid=135user='mockbuild'nspawn_args=[]unshare_net=TrueraiseExc=FalseprintOutput=False)
Executing command: ['bash', '--login', '-c', '/usr/bin/rpmbuild -br --noclean --target noarch --nodeps /builddir/build/SPECS/python-datanommer-models.spec'] with env {'TERM': 'vt100', 'SHELL': '/bin/bash', 'HOME': 
'/builddir', 'HOSTNAME': 'mock', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'PROMPT_COMMAND': 'printf "\\033]0;\\007"', 'PS1': ' \\s-\\v\\$ ', 'LANG': 'C.UTF-8'} and shell False Building target platforms: noarch Building for target noarch setting SOURCE_DATE_EPOCH=1674172800 Executing(%prep): /bin/sh -e /var/tmp/rpm-tmp.eSJxao + umask 022 + cd /builddir/build/BUILD + cd /builddir/build/BUILD + rm -rf datanommer.models-1.0.4 + /usr/lib/rpm/rpmuncompress -x /builddir/build/SOURCES/datanommer.models-1.0.4.tar.gz + STATUS=0 + '[' 0 -ne 0 ']' + cd datanommer.models-1.0.4 + /usr/bin/chmod -Rf a+rX,u+w,g-w,o-w . + rm -rf '*.egg-info' + RPM_EC=0 ++ jobs -p + exit 0 Executing(%generate_buildrequires): /bin/sh -e /var/tmp/rpm-tmp.JQqal4 + umask 022 + cd /builddir/build/BUILD + cd datanommer.models-1.0.4 + CFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fstack-clash-protection -fno-omit-frame-pointer ' + export CFLAGS + CXXFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fstack-clash-protection -fno-omit-frame-pointer ' + export CXXFLAGS + FFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fstack-clash-protection -fno-omit-frame-pointer 
-I/usr/lib/gfortran/modules ' + export FFLAGS + FCFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fstack-clash-protection -fno-omit-frame-pointer -I/usr/lib/gfortran/modules ' + export FCFLAGS + VALAFLAGS=-g + export VALAFLAGS + LDFLAGS='-Wl,-z,relro -Wl,--as-needed -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -Wl,--build-id=sha1 ' + export LDFLAGS + LT_SYS_LIBRARY_PATH=/usr/lib: + export LT_SYS_LIBRARY_PATH + CC=gcc + export CC + CXX=g++ + export CXX + echo pyproject-rpm-macros + echo python3-devel + echo 'python3dist(pip) >= 19' + echo 'python3dist(packaging)' + '[' -f pyproject.toml ']' + echo '(python3dist(toml) if python3-devel < 3.11)' + rm -rfv '*.dist-info/' + '[' -f /usr/bin/python3 ']' + mkdir -p /builddir/build/BUILD/datanommer.models-1.0.4/.pyproject-builddir + CFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fstack-clash-protection -fno-omit-frame-pointer ' + LDFLAGS='-Wl,-z,relro -Wl,--as-needed -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -Wl,--build-id=sha1 ' + TMPDIR=/builddir/build/BUILD/datanommer.models-1.0.4/.pyproject-builddir + RPM_TOXENV=py311 + HOSTNAME=rpmbuild + /usr/bin/python3 -Bs /usr/lib/rpm/redhat/pyproject_buildrequires.py --generate-extras --python3_pkgversion 3 --wheeldir /builddir/build/BUILD/datanommer.models-1.0.4/pyproject-wheeldir Handling poetry-core>=1.0.0 
from build-system.requires Requirement not satisfied: poetry-core>=1.0.0 Exiting dependency generation pass: build backend + rm -rfv '*.dist-info/' + RPM_EC=0 ++ jobs -p + exit 0 Wrote: /builddir/build/SRPMS/python-datanommer-models-1.0.4-5.fc38.buildreqs.nosrc.rpm Child return code was: 11 Dynamic buildrequires detected Going to install missing buildrequires. See root.log for details. ENTER ['do_with_status'](['bash', '--login', '-c', '/usr/bin/rpmbuild -br --noclean --target noarch --nodeps /builddir/build/SPECS/python-datanommer-models.spec'], chrootPath='/var/lib/mock/f38-build-side-42-init-devel-592210-26145/root'env={'TERM': 'vt100', 'SHELL': '/bin/bash', 'HOME': '/builddir', 'HOSTNAME': 'mock', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'PROMPT_COMMAND': 'printf "\\033]0;\\007"', 'PS1': ' \\s-\\v\\$ ', 'LANG': 'C.UTF-8'}shell=Falselogger=timeout=864000uid=996gid=135user='mockbuild'nspawn_args=[]unshare_net=TrueraiseExc=FalseprintOutput=False) Executing command: ['bash', '--login', '-c', '/usr/bin/rpmbuild -br --noclean --target noarch --nodeps /builddir/build/SPECS/python-datanommer-models.spec'] with env {'TERM': 'vt100', 'SHELL': '/bin/bash', 'HOME': '/builddir', 'HOSTNAME': 'mock', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'PROMPT_COMMAND': 'printf "\\033]0;\\007"', 'PS1': ' \\s-\\v\\$ ', 'LANG': 'C.UTF-8'} and shell False Building target platforms: noarch Building for target noarch setting SOURCE_DATE_EPOCH=1674172800 Executing(%prep): /bin/sh -e /var/tmp/rpm-tmp.QhoaN0 + umask 022 + cd /builddir/build/BUILD + cd /builddir/build/BUILD + rm -rf datanommer.models-1.0.4 + /usr/lib/rpm/rpmuncompress -x /builddir/build/SOURCES/datanommer.models-1.0.4.tar.gz + STATUS=0 + '[' 0 -ne 0 ']' + cd datanommer.models-1.0.4 + /usr/bin/chmod -Rf a+rX,u+w,g-w,o-w . 
+ rm -rf '*.egg-info' + RPM_EC=0 ++ jobs -p + exit 0 Executing(%generate_buildrequires): /bin/sh -e /var/tmp/rpm-tmp.TrqSsE + umask 022 + cd /builddir/build/BUILD + cd datanommer.models-1.0.4 + CFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fstack-clash-protection -fno-omit-frame-pointer ' + export CFLAGS + CXXFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fstack-clash-protection -fno-omit-frame-pointer ' + export CXXFLAGS + FFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fstack-clash-protection -fno-omit-frame-pointer -I/usr/lib/gfortran/modules ' + export FFLAGS + FCFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fstack-clash-protection -fno-omit-frame-pointer -I/usr/lib/gfortran/modules ' + export FCFLAGS + VALAFLAGS=-g + export VALAFLAGS + LDFLAGS='-Wl,-z,relro -Wl,--as-needed -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld 
-specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -Wl,--build-id=sha1 ' + export LDFLAGS + LT_SYS_LIBRARY_PATH=/usr/lib: + export LT_SYS_LIBRARY_PATH + CC=gcc + export CC + CXX=g++ + export CXX + echo pyproject-rpm-macros + echo python3-devel + echo 'python3dist(pip) >= 19' + echo 'python3dist(packaging)' + '[' -f pyproject.toml ']' + echo '(python3dist(toml) if python3-devel < 3.11)' + rm -rfv '*.dist-info/' + '[' -f /usr/bin/python3 ']' + mkdir -p /builddir/build/BUILD/datanommer.models-1.0.4/.pyproject-builddir + CFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fstack-clash-protection -fno-omit-frame-pointer ' + LDFLAGS='-Wl,-z,relro -Wl,--as-needed -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -Wl,--build-id=sha1 ' + TMPDIR=/builddir/build/BUILD/datanommer.models-1.0.4/.pyproject-builddir + RPM_TOXENV=py311 + HOSTNAME=rpmbuild + /usr/bin/python3 -Bs /usr/lib/rpm/redhat/pyproject_buildrequires.py --generate-extras --python3_pkgversion 3 --wheeldir /builddir/build/BUILD/datanommer.models-1.0.4/pyproject-wheeldir Handling poetry-core>=1.0.0 from build-system.requires Requirement satisfied: poetry-core>=1.0.0 (installed: poetry-core 1.4.0) Handling SQLAlchemy (>=1.3.24,<2.0.0) from hook generated metadata: Requires-Dist Requirement not satisfied: SQLAlchemy (>=1.3.24,<2.0.0) Handling alembic (>=1.6.5,<2.0.0) from hook generated metadata: Requires-Dist Requirement not satisfied: alembic (>=1.6.5,<2.0.0) Handling anitya-schema ; extra == "schemas" from hook generated metadata: Requires-Dist Ignoring alien requirement: anitya-schema ; extra == "schemas" Handling bodhi-messages ; extra == "schemas" from hook generated metadata: 
Requires-Dist Ignoring alien requirement: bodhi-messages ; extra == "schemas" Handling copr-messaging ; extra == "schemas" from hook generated metadata: Requires-Dist Ignoring alien requirement: copr-messaging ; extra == "schemas" Handling discourse2fedmsg-messages ; extra == "schemas" from hook generated metadata: Requires-Dist Ignoring alien requirement: discourse2fedmsg-messages ; extra == "schemas" Handling fedocal-messages ; extra == "schemas" from hook generated metadata: Requires-Dist Ignoring alien requirement: fedocal-messages ; extra == "schemas" Handling fedora-messaging (>=2.1.0) from hook generated metadata: Requires-Dist Requirement satisfied: fedora-messaging (>=2.1.0) (installed: fedora-messaging 3.3.0) Handling fedora-messaging-the-new-hotness-schema ; extra == "schemas" from hook generated metadata: Requires-Dist Ignoring alien requirement: fedora-messaging-the-new-hotness-schema ; extra == "schemas" Handling fedora-planet-messages ; extra == "schemas" from hook generated metadata: Requires-Dist Ignoring alien requirement: fedora-planet-messages ; extra == "schemas" Handling fedorainfra-ansible-messages ; extra == "schemas" from hook generated metadata: Requires-Dist Ignoring alien requirement: fedorainfra-ansible-messages ; extra == "schemas" Handling mdapi-messages ; extra == "schemas" from hook generated metadata: Requires-Dist Ignoring alien requirement: mdapi-messages ; extra == "schemas" Handling noggin-messages ; extra == "schemas" from hook generated metadata: Requires-Dist Ignoring alien requirement: noggin-messages ; extra == "schemas" Handling nuancier-messages ; extra == "schemas" from hook generated metadata: Requires-Dist Ignoring alien requirement: nuancier-messages ; extra == "schemas" Handling pagure-messages ; extra == "schemas" from hook generated metadata: Requires-Dist Ignoring alien requirement: pagure-messages ; extra == "schemas" Handling psycopg2 (>=2.9.1,<3.0.0) from hook generated metadata: Requires-Dist Requirement not 
satisfied: psycopg2 (>=2.9.1,<3.0.0) + rm -rfv datanommer_models-1.0.4.dist-info/ removed 'datanommer_models-1.0.4.dist-info/LICENSE' removed 'datanommer_models-1.0.4.dist-info/METADATA' removed 'datanommer_models-1.0.4.dist-info/WHEEL' removed directory 'datanommer_models-1.0.4.dist-info/' + RPM_EC=0 ++ jobs -p + exit 0 Wrote: /builddir/build/SRPMS/python-datanommer-models-1.0.4-5.fc38.buildreqs.nosrc.rpm Child return code was: 11 Dynamic buildrequires detected Going to install missing buildrequires. See root.log for details. ENTER ['do_with_status'](['bash', '--login', '-c', '/usr/bin/rpmbuild -br --noclean --target noarch --nodeps /builddir/build/SPECS/python-datanommer-models.spec'], chrootPath='/var/lib/mock/f38-build-side-42-init-devel-592210-26145/root'env={'TERM': 'vt100', 'SHELL': '/bin/bash', 'HOME': '/builddir', 'HOSTNAME': 'mock', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'PROMPT_COMMAND': 'printf "\\033]0;\\007"', 'PS1': ' \\s-\\v\\$ ', 'LANG': 'C.UTF-8'}shell=Falselogger=timeout=864000uid=996gid=135user='mockbuild'nspawn_args=[]unshare_net=TrueraiseExc=FalseprintOutput=False) Executing command: ['bash', '--login', '-c', '/usr/bin/rpmbuild -br --noclean --target noarch --nodeps /builddir/build/SPECS/python-datanommer-models.spec'] with env {'TERM': 'vt100', 'SHELL': '/bin/bash', 'HOME': '/builddir', 'HOSTNAME': 'mock', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'PROMPT_COMMAND': 'printf "\\033]0;\\007"', 'PS1': ' \\s-\\v\\$ ', 'LANG': 'C.UTF-8'} and shell False Building target platforms: noarch Building for target noarch setting SOURCE_DATE_EPOCH=1674172800 Executing(%prep): /bin/sh -e /var/tmp/rpm-tmp.x9JCdR + umask 022 + cd /builddir/build/BUILD + cd /builddir/build/BUILD + rm -rf datanommer.models-1.0.4 + /usr/lib/rpm/rpmuncompress -x /builddir/build/SOURCES/datanommer.models-1.0.4.tar.gz + STATUS=0 + '[' 0 -ne 0 ']' + cd datanommer.models-1.0.4 + /usr/bin/chmod -Rf a+rX,u+w,g-w,o-w . 
+ rm -rf '*.egg-info' + RPM_EC=0 ++ jobs -p + exit 0 Executing(%generate_buildrequires): /bin/sh -e /var/tmp/rpm-tmp.JuD5Hd + umask 022 + cd /builddir/build/BUILD + cd datanommer.models-1.0.4 + CFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fstack-clash-protection -fno-omit-frame-pointer ' + export CFLAGS + CXXFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fstack-clash-protection -fno-omit-frame-pointer ' + export CXXFLAGS + FFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fstack-clash-protection -fno-omit-frame-pointer -I/usr/lib/gfortran/modules ' + export FFLAGS + FCFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fstack-clash-protection -fno-omit-frame-pointer -I/usr/lib/gfortran/modules ' + export FCFLAGS + VALAFLAGS=-g + export VALAFLAGS + LDFLAGS='-Wl,-z,relro -Wl,--as-needed -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld 
-specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -Wl,--build-id=sha1 ' + export LDFLAGS + LT_SYS_LIBRARY_PATH=/usr/lib: + export LT_SYS_LIBRARY_PATH + CC=gcc + export CC + CXX=g++ + export CXX + echo pyproject-rpm-macros + echo python3-devel + echo 'python3dist(pip) >= 19' + echo 'python3dist(packaging)' + '[' -f pyproject.toml ']' + echo '(python3dist(toml) if python3-devel < 3.11)' + rm -rfv '*.dist-info/' + '[' -f /usr/bin/python3 ']' + mkdir -p /builddir/build/BUILD/datanommer.models-1.0.4/.pyproject-builddir + CFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fstack-clash-protection -fno-omit-frame-pointer ' + LDFLAGS='-Wl,-z,relro -Wl,--as-needed -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -Wl,--build-id=sha1 ' + TMPDIR=/builddir/build/BUILD/datanommer.models-1.0.4/.pyproject-builddir + RPM_TOXENV=py311 + HOSTNAME=rpmbuild + /usr/bin/python3 -Bs /usr/lib/rpm/redhat/pyproject_buildrequires.py --generate-extras --python3_pkgversion 3 --wheeldir /builddir/build/BUILD/datanommer.models-1.0.4/pyproject-wheeldir Handling poetry-core>=1.0.0 from build-system.requires Requirement satisfied: poetry-core>=1.0.0 (installed: poetry-core 1.4.0) Handling SQLAlchemy (>=1.3.24,<2.0.0) from hook generated metadata: Requires-Dist Requirement satisfied: SQLAlchemy (>=1.3.24,<2.0.0) (installed: SQLAlchemy 1.4.46) Handling alembic (>=1.6.5,<2.0.0) from hook generated metadata: Requires-Dist Requirement satisfied: alembic (>=1.6.5,<2.0.0) (installed: alembic 1.9.3) Handling anitya-schema ; extra == "schemas" from hook generated metadata: Requires-Dist Ignoring alien requirement: anitya-schema ; extra == "schemas" Handling bodhi-messages ; extra == 
"schemas" from hook generated metadata: Requires-Dist Ignoring alien requirement: bodhi-messages ; extra == "schemas" Handling copr-messaging ; extra == "schemas" from hook generated metadata: Requires-Dist Ignoring alien requirement: copr-messaging ; extra == "schemas" Handling discourse2fedmsg-messages ; extra == "schemas" from hook generated metadata: Requires-Dist Ignoring alien requirement: discourse2fedmsg-messages ; extra == "schemas" Handling fedocal-messages ; extra == "schemas" from hook generated metadata: Requires-Dist Ignoring alien requirement: fedocal-messages ; extra == "schemas" Handling fedora-messaging (>=2.1.0) from hook generated metadata: Requires-Dist Requirement satisfied: fedora-messaging (>=2.1.0) (installed: fedora-messaging 3.3.0) Handling fedora-messaging-the-new-hotness-schema ; extra == "schemas" from hook generated metadata: Requires-Dist Ignoring alien requirement: fedora-messaging-the-new-hotness-schema ; extra == "schemas" Handling fedora-planet-messages ; extra == "schemas" from hook generated metadata: Requires-Dist Ignoring alien requirement: fedora-planet-messages ; extra == "schemas" Handling fedorainfra-ansible-messages ; extra == "schemas" from hook generated metadata: Requires-Dist Ignoring alien requirement: fedorainfra-ansible-messages ; extra == "schemas" Handling mdapi-messages ; extra == "schemas" from hook generated metadata: Requires-Dist Ignoring alien requirement: mdapi-messages ; extra == "schemas" Handling noggin-messages ; extra == "schemas" from hook generated metadata: Requires-Dist Ignoring alien requirement: noggin-messages ; extra == "schemas" Handling nuancier-messages ; extra == "schemas" from hook generated metadata: Requires-Dist Ignoring alien requirement: nuancier-messages ; extra == "schemas" Handling pagure-messages ; extra == "schemas" from hook generated metadata: Requires-Dist Ignoring alien requirement: pagure-messages ; extra == "schemas" Handling psycopg2 (>=2.9.1,<3.0.0) from hook generated 
metadata: Requires-Dist Requirement satisfied: psycopg2 (>=2.9.1,<3.0.0) (installed: psycopg2 2.9.3) + rm -rfv datanommer_models-1.0.4.dist-info/ removed 'datanommer_models-1.0.4.dist-info/LICENSE' removed 'datanommer_models-1.0.4.dist-info/METADATA' removed 'datanommer_models-1.0.4.dist-info/WHEEL' removed directory 'datanommer_models-1.0.4.dist-info/' + RPM_EC=0 ++ jobs -p + exit 0 Wrote: /builddir/build/SRPMS/python-datanommer-models-1.0.4-5.fc38.buildreqs.nosrc.rpm Child return code was: 11 Dynamic buildrequires detected Going to install missing buildrequires. See root.log for details. ENTER ['do_with_status'](['bash', '--login', '-c', '/usr/bin/rpmbuild -ba --noprep --noclean --target noarch --nodeps /builddir/build/SPECS/python-datanommer-models.spec'], chrootPath='/var/lib/mock/f38-build-side-42-init-devel-592210-26145/root'env={'TERM': 'vt100', 'SHELL': '/bin/bash', 'HOME': '/builddir', 'HOSTNAME': 'mock', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'PROMPT_COMMAND': 'printf "\\033]0;\\007"', 'PS1': ' \\s-\\v\\$ ', 'LANG': 'C.UTF-8'}shell=Falselogger=timeout=864000uid=996gid=135user='mockbuild'nspawn_args=[]unshare_net=TrueprintOutput=False) Executing command: ['bash', '--login', '-c', '/usr/bin/rpmbuild -ba --noprep --noclean --target noarch --nodeps /builddir/build/SPECS/python-datanommer-models.spec'] with env {'TERM': 'vt100', 'SHELL': '/bin/bash', 'HOME': '/builddir', 'HOSTNAME': 'mock', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'PROMPT_COMMAND': 'printf "\\033]0;\\007"', 'PS1': ' \\s-\\v\\$ ', 'LANG': 'C.UTF-8'} and shell False Building target platforms: noarch Building for target noarch setting SOURCE_DATE_EPOCH=1674172800 Executing(%generate_buildrequires): /bin/sh -e /var/tmp/rpm-tmp.aFxPGs + umask 022 + cd /builddir/build/BUILD + cd datanommer.models-1.0.4 + CFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS 
-specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fstack-clash-protection -fno-omit-frame-pointer ' + export CFLAGS + CXXFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fstack-clash-protection -fno-omit-frame-pointer ' + export CXXFLAGS + FFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fstack-clash-protection -fno-omit-frame-pointer -I/usr/lib/gfortran/modules ' + export FFLAGS + FCFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fstack-clash-protection -fno-omit-frame-pointer -I/usr/lib/gfortran/modules ' + export FCFLAGS + VALAFLAGS=-g + export VALAFLAGS + LDFLAGS='-Wl,-z,relro -Wl,--as-needed -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -Wl,--build-id=sha1 ' + export LDFLAGS + LT_SYS_LIBRARY_PATH=/usr/lib: + export LT_SYS_LIBRARY_PATH + CC=gcc + export CC + CXX=g++ + export CXX + echo pyproject-rpm-macros + echo python3-devel + echo 'python3dist(pip) >= 19' + echo 'python3dist(packaging)' + '[' -f pyproject.toml ']' + echo '(python3dist(toml) if python3-devel < 3.11)' + rm -rfv 
'*.dist-info/' + '[' -f /usr/bin/python3 ']' + mkdir -p /builddir/build/BUILD/datanommer.models-1.0.4/.pyproject-builddir + CFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fstack-clash-protection -fno-omit-frame-pointer ' + LDFLAGS='-Wl,-z,relro -Wl,--as-needed -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -Wl,--build-id=sha1 ' + TMPDIR=/builddir/build/BUILD/datanommer.models-1.0.4/.pyproject-builddir + RPM_TOXENV=py311 + HOSTNAME=rpmbuild + /usr/bin/python3 -Bs /usr/lib/rpm/redhat/pyproject_buildrequires.py --generate-extras --python3_pkgversion 3 --wheeldir /builddir/build/BUILD/datanommer.models-1.0.4/pyproject-wheeldir Handling poetry-core>=1.0.0 from build-system.requires Requirement satisfied: poetry-core>=1.0.0 (installed: poetry-core 1.4.0) Handling SQLAlchemy (>=1.3.24,<2.0.0) from hook generated metadata: Requires-Dist Requirement satisfied: SQLAlchemy (>=1.3.24,<2.0.0) (installed: SQLAlchemy 1.4.46) Handling alembic (>=1.6.5,<2.0.0) from hook generated metadata: Requires-Dist Requirement satisfied: alembic (>=1.6.5,<2.0.0) (installed: alembic 1.9.3) Handling anitya-schema ; extra == "schemas" from hook generated metadata: Requires-Dist Ignoring alien requirement: anitya-schema ; extra == "schemas" Handling bodhi-messages ; extra == "schemas" from hook generated metadata: Requires-Dist Ignoring alien requirement: bodhi-messages ; extra == "schemas" Handling copr-messaging ; extra == "schemas" from hook generated metadata: Requires-Dist Ignoring alien requirement: copr-messaging ; extra == "schemas" Handling discourse2fedmsg-messages ; extra == "schemas" from hook generated metadata: Requires-Dist Ignoring alien 
requirement: discourse2fedmsg-messages ; extra == "schemas" Handling fedocal-messages ; extra == "schemas" from hook generated metadata: Requires-Dist Ignoring alien requirement: fedocal-messages ; extra == "schemas" Handling fedora-messaging (>=2.1.0) from hook generated metadata: Requires-Dist Requirement satisfied: fedora-messaging (>=2.1.0) (installed: fedora-messaging 3.3.0) Handling fedora-messaging-the-new-hotness-schema ; extra == "schemas" from hook generated metadata: Requires-Dist Ignoring alien requirement: fedora-messaging-the-new-hotness-schema ; extra == "schemas" Handling fedora-planet-messages ; extra == "schemas" from hook generated metadata: Requires-Dist Ignoring alien requirement: fedora-planet-messages ; extra == "schemas" Handling fedorainfra-ansible-messages ; extra == "schemas" from hook generated metadata: Requires-Dist Ignoring alien requirement: fedorainfra-ansible-messages ; extra == "schemas" Handling mdapi-messages ; extra == "schemas" from hook generated metadata: Requires-Dist Ignoring alien requirement: mdapi-messages ; extra == "schemas" Handling noggin-messages ; extra == "schemas" from hook generated metadata: Requires-Dist Ignoring alien requirement: noggin-messages ; extra == "schemas" Handling nuancier-messages ; extra == "schemas" from hook generated metadata: Requires-Dist Ignoring alien requirement: nuancier-messages ; extra == "schemas" Handling pagure-messages ; extra == "schemas" from hook generated metadata: Requires-Dist Ignoring alien requirement: pagure-messages ; extra == "schemas" Handling psycopg2 (>=2.9.1,<3.0.0) from hook generated metadata: Requires-Dist Requirement satisfied: psycopg2 (>=2.9.1,<3.0.0) (installed: psycopg2 2.9.3) + rm -rfv datanommer_models-1.0.4.dist-info/ removed 'datanommer_models-1.0.4.dist-info/LICENSE' removed 'datanommer_models-1.0.4.dist-info/METADATA' removed 'datanommer_models-1.0.4.dist-info/WHEEL' removed directory 'datanommer_models-1.0.4.dist-info/' + RPM_EC=0 ++ jobs -p + exit 0 
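Each %generate_buildrequires pass above checks every declared requirement against what is already installed in the buildroot ("Requirement satisfied" / "Requirement not satisfied"), and mock installs the unmet ones and reruns until everything resolves. The version check can be sketched roughly as below; this is a simplified numeric comparison for illustration, not the actual pyproject_buildrequires.py logic and not full PEP 440 handling:

```python
import operator
import re

# Comparison operators, longest first so ">=" wins over ">"
OPS = {">=": operator.ge, "<=": operator.le, "==": operator.eq,
       "!=": operator.ne, ">": operator.gt, "<": operator.lt}

def vtuple(version):
    # "1.4.46" -> (1, 4, 46); naive, ignores pre-releases etc.
    return tuple(int(part) for part in version.split("."))

def satisfies(installed, spec):
    # spec like ">=1.3.24,<2.0.0": every comma-separated clause must hold
    for clause in spec.split(","):
        m = re.match(r"(>=|<=|==|!=|>|<)\s*([\d.]+)", clause.strip())
        op, version = m.groups()
        if not OPS[op](vtuple(installed), vtuple(version)):
            return False
    return True

# Mirror the checks seen in the log:
print(satisfies("1.4.46", ">=1.3.24,<2.0.0"))  # SQLAlchemy: satisfied
print(satisfies("1.9.3", ">=1.6.5,<2.0.0"))    # alembic: satisfied
```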
Executing(%build): /bin/sh -e /var/tmp/rpm-tmp.u3vCw3 + umask 022 + cd /builddir/build/BUILD + CFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fstack-clash-protection -fno-omit-frame-pointer ' + export CFLAGS + CXXFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fstack-clash-protection -fno-omit-frame-pointer ' + export CXXFLAGS + FFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fstack-clash-protection -fno-omit-frame-pointer -I/usr/lib/gfortran/modules ' + export FFLAGS + FCFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fstack-clash-protection -fno-omit-frame-pointer -I/usr/lib/gfortran/modules ' + export FCFLAGS + VALAFLAGS=-g + export VALAFLAGS + LDFLAGS='-Wl,-z,relro -Wl,--as-needed -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -Wl,--build-id=sha1 ' + export LDFLAGS + LT_SYS_LIBRARY_PATH=/usr/lib: 
+ export LT_SYS_LIBRARY_PATH + CC=gcc + export CC + CXX=g++ + export CXX + cd datanommer.models-1.0.4 + mkdir -p /builddir/build/BUILD/datanommer.models-1.0.4/.pyproject-builddir + CFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fstack-clash-protection -fno-omit-frame-pointer ' + LDFLAGS='-Wl,-z,relro -Wl,--as-needed -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -Wl,--build-id=sha1 ' + TMPDIR=/builddir/build/BUILD/datanommer.models-1.0.4/.pyproject-builddir + /usr/bin/python3 -Bs /usr/lib/rpm/redhat/pyproject_wheel.py /builddir/build/BUILD/datanommer.models-1.0.4/pyproject-wheeldir Processing /builddir/build/BUILD/datanommer.models-1.0.4 Preparing metadata (pyproject.toml): started Running command Preparing metadata (pyproject.toml) Preparing metadata (pyproject.toml): finished with status 'done' Building wheels for collected packages: datanommer-models Building wheel for datanommer-models (pyproject.toml): started Running command Building wheel for datanommer-models (pyproject.toml) Building wheel for datanommer-models (pyproject.toml): finished with status 'done' Created wheel for datanommer-models: filename=datanommer_models-1.0.4-py3-none-any.whl size=23723 sha256=6e2c9ada1c4617373b90c48e362dd424af6271363400ff1da941da543a5023cc Stored in directory: /builddir/.cache/pip/wheels/e6/6b/44/08d3d60541b36617945e80013bdb6fe517a5a77368753ef336 Successfully built datanommer-models + RPM_EC=0 ++ jobs -p + exit 0 Executing(%install): /bin/sh -e /var/tmp/rpm-tmp.5cypoU + umask 022 + cd /builddir/build/BUILD + '[' /builddir/build/BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch '!=' / ']' + rm -rf 
/builddir/build/BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch ++ dirname /builddir/build/BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch + mkdir -p /builddir/build/BUILDROOT + mkdir /builddir/build/BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch + CFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fstack-clash-protection -fno-omit-frame-pointer ' + export CFLAGS + CXXFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fstack-clash-protection -fno-omit-frame-pointer ' + export CXXFLAGS + FFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fstack-clash-protection -fno-omit-frame-pointer -I/usr/lib/gfortran/modules ' + export FFLAGS + FCFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fstack-clash-protection -fno-omit-frame-pointer -I/usr/lib/gfortran/modules ' + export FCFLAGS + VALAFLAGS=-g + export VALAFLAGS + LDFLAGS='-Wl,-z,relro 
-Wl,--as-needed -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -Wl,--build-id=sha1 ' + export LDFLAGS + LT_SYS_LIBRARY_PATH=/usr/lib: + export LT_SYS_LIBRARY_PATH + CC=gcc + export CC + CXX=g++ + export CXX + cd datanommer.models-1.0.4 ++ xargs basename --multiple ++ ls /builddir/build/BUILD/datanommer.models-1.0.4/pyproject-wheeldir/datanommer_models-1.0.4-py3-none-any.whl ++ sed -E 's/([^-]+)-([^-]+)-.+\.whl/\1==\2/' + specifier=datanommer_models==1.0.4 + TMPDIR=/builddir/build/BUILD/datanommer.models-1.0.4/.pyproject-builddir + /usr/bin/python3 -m pip install --root /builddir/build/BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch --prefix /usr --no-deps --disable-pip-version-check --progress-bar off --verbose --ignore-installed --no-warn-script-location --no-index --no-cache-dir --find-links /builddir/build/BUILD/datanommer.models-1.0.4/pyproject-wheeldir datanommer_models==1.0.4 Using pip 22.3.1 from /usr/lib/python3.11/site-packages/pip (python 3.11) Looking in links: /builddir/build/BUILD/datanommer.models-1.0.4/pyproject-wheeldir Processing ./pyproject-wheeldir/datanommer_models-1.0.4-py3-none-any.whl Installing collected packages: datanommer_models Successfully installed datanommer_models-1.0.4 + '[' -d /builddir/build/BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/bin ']' + rm -f /builddir/build/BUILD/python-datanommer-models-1.0.4-5.fc38.noarch-pyproject-ghost-distinfo + site_dirs=() + '[' -d /builddir/build/BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages ']' + site_dirs+=("/usr/lib/python3.11/site-packages") + '[' /builddir/build/BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib64/python3.11/site-packages '!=' /builddir/build/BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages ']' + '[' -d /builddir/build/BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib64/python3.11/site-packages 
']' + for site_dir in ${site_dirs[@]} + for distinfo in /builddir/build/BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch$site_dir/*.dist-info + echo '%ghost /usr/lib/python3.11/site-packages/datanommer_models-1.0.4.dist-info' + sed -i s/pip/rpm/ /builddir/build/BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer_models-1.0.4.dist-info/INSTALLER + PYTHONPATH=/usr/lib/rpm/redhat + /usr/bin/python3 -B /usr/lib/rpm/redhat/pyproject_preprocess_record.py --buildroot /builddir/build/BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch --record /builddir/build/BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer_models-1.0.4.dist-info/RECORD --output /builddir/build/BUILD/python-datanommer-models-1.0.4-5.fc38.noarch-pyproject-record + rm -fv /builddir/build/BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer_models-1.0.4.dist-info/RECORD removed '/builddir/build/BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer_models-1.0.4.dist-info/RECORD' + rm -fv /builddir/build/BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer_models-1.0.4.dist-info/REQUESTED removed '/builddir/build/BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer_models-1.0.4.dist-info/REQUESTED' ++ wc -l /builddir/build/BUILD/python-datanommer-models-1.0.4-5.fc38.noarch-pyproject-ghost-distinfo ++ cut -f1 '-d ' + lines=1 + '[' 1 -ne 1 ']' + /usr/bin/python3 /usr/lib/rpm/redhat/pyproject_save_files.py --output-files /builddir/build/BUILD/python-datanommer-models-1.0.4-5.fc38.noarch-pyproject-files --output-modules /builddir/build/BUILD/python-datanommer-models-1.0.4-5.fc38.noarch-pyproject-modules --buildroot /builddir/build/BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch --sitelib /usr/lib/python3.11/site-packages 
--sitearch /usr/lib64/python3.11/site-packages --python-version 3.11 --pyproject-record /builddir/build/BUILD/python-datanommer-models-1.0.4-5.fc38.noarch-pyproject-record --prefix /usr datanommer + /usr/bin/mkdir -p /builddir/build/BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/etc/datanommer-models + install -m 644 alembic.ini /builddir/build/BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/etc/datanommer-models/alembic.ini + /usr/bin/find-debuginfo -j8 --strict-build-id -m -i --build-id-seed 1.0.4-5.fc38 --unique-debug-suffix -1.0.4-5.fc38.noarch --unique-debug-src-base python-datanommer-models-1.0.4-5.fc38.noarch --run-dwz --dwz-low-mem-die-limit 10000000 --dwz-max-die-limit 50000000 -S debugsourcefiles.list /builddir/build/BUILD/datanommer.models-1.0.4 find: 'debug': No such file or directory + /usr/lib/rpm/check-buildroot + /usr/lib/rpm/redhat/brp-ldconfig + /usr/lib/rpm/brp-compress + /usr/lib/rpm/redhat/brp-strip-lto /usr/bin/strip + /usr/lib/rpm/brp-strip-static-archive /usr/bin/strip + /usr/lib/rpm/check-rpaths + /usr/lib/rpm/redhat/brp-mangle-shebangs + /usr/lib/rpm/brp-remove-la-files + env /usr/lib/rpm/redhat/brp-python-bytecompile '' 1 0 -j8 Bytecompiling .py files below /builddir/build/BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11 using python3.11 + /usr/lib/rpm/redhat/brp-python-hardlink Executing(%check): /bin/sh -e /var/tmp/rpm-tmp.xv2SVi + umask 022 + cd /builddir/build/BUILD + CFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fstack-clash-protection -fno-omit-frame-pointer ' + export CFLAGS + CXXFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security 
-Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fstack-clash-protection -fno-omit-frame-pointer ' + export CXXFLAGS + FFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fstack-clash-protection -fno-omit-frame-pointer -I/usr/lib/gfortran/modules ' + export FFLAGS + FCFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fstack-clash-protection -fno-omit-frame-pointer -I/usr/lib/gfortran/modules ' + export FCFLAGS + VALAFLAGS=-g + export VALAFLAGS + LDFLAGS='-Wl,-z,relro -Wl,--as-needed -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -Wl,--build-id=sha1 ' + export LDFLAGS + LT_SYS_LIBRARY_PATH=/usr/lib: + export LT_SYS_LIBRARY_PATH + CC=gcc + export CC + CXX=g++ + export CXX + cd datanommer.models-1.0.4 + '[' '!' 
-f /builddir/build/BUILD/python-datanommer-models-1.0.4-5.fc38.noarch-pyproject-modules ']'
+ PATH=/builddir/build/BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/bin:/usr/bin:/bin:/usr/sbin:/sbin:/usr/local/sbin
+ PYTHONPATH=/builddir/build/BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib64/python3.11/site-packages:/builddir/build/BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages
+ _PYTHONSITE=/builddir/build/BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib64/python3.11/site-packages:/builddir/build/BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages
+ PYTHONDONTWRITEBYTECODE=1
+ /usr/bin/python3 -sP /usr/lib/rpm/redhat/import_all_modules.py -f /builddir/build/BUILD/python-datanommer-models-1.0.4-5.fc38.noarch-pyproject-modules -e datanommer.models.alembic.env
Check import: datanommer
Check import: datanommer.models
Check import: datanommer.models.alembic
Check import: datanommer.models.testing
+ CFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-U_FORTIFY_SOURCE,-D_FORTIFY_SOURCE=3 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -fasynchronous-unwind-tables -fstack-clash-protection -fno-omit-frame-pointer '
+ LDFLAGS='-Wl,-z,relro -Wl,--as-needed -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -Wl,--build-id=sha1 '
+ PATH=/builddir/build/BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/bin:/usr/bin:/bin:/usr/sbin:/sbin:/usr/local/sbin
+ PYTHONPATH=/builddir/build/BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib64/python3.11/site-packages:/builddir/build/BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages
+ PYTHONDONTWRITEBYTECODE=1
+ PYTEST_ADDOPTS='
--ignore=/builddir/build/BUILD/datanommer.models-1.0.4/.pyproject-builddir'
+ PYTEST_XDIST_AUTO_NUM_WORKERS=8
+ /usr/bin/pytest -v
============================= test session starts ==============================
platform linux -- Python 3.11.2, pytest-7.2.2, pluggy-1.0.0 -- /usr/bin/python3
cachedir: .pytest_cache
rootdir: /builddir/build/BUILD/datanommer.models-1.0.4
plugins: mock-3.10.0, postgresql-3.1.3
collecting ... collected 48 items

tests/test_jsonencodeddict.py::test_jsonencodeddict PASSED [ 2%]
tests/test_jsonencodeddict.py::test_jsonencodeddict_null PASSED [ 4%]
tests/test_jsonencodeddict.py::test_jsonencodeddict_compare PASSED [ 6%]
tests/test_jsonencodeddict.py::test_jsonencodeddict_compare_like PASSED [ 8%]
tests/test_model.py::test_init_uri_and_engine PASSED [ 10%]
tests/test_model.py::test_init_no_uri_and_no_engine PASSED [ 12%]
tests/test_model.py::test_init_with_engine PASSED [ 14%]
tests/test_model.py::test_init_no_init_twice PASSED [ 16%]
tests/test_model.py::test_unclassified_category FAILED [ 18%]
tests/test_model.py::test_from_msg_id FAILED [ 20%]
tests/test_model.py::test_add_missing_msg_id FAILED [ 22%]
tests/test_model.py::test_add_missing_timestamp FAILED [ 25%]
tests/test_model.py::test_add_timestamp_with_Z FAILED [ 27%]
tests/test_model.py::test_add_timestamp_with_junk PASSED [ 29%]
tests/test_model.py::test_add_and_check_for_others FAILED [ 31%]
tests/test_model.py::test_add_nothing PASSED [ 33%]
tests/test_model.py::test_add_and_check FAILED [ 35%]
tests/test_model.py::test_categories FAILED [ 37%]
tests/test_model.py::test_categories_with_umb FAILED [ 39%]
tests/test_model.py::test_grep_all FAILED [ 41%]
tests/test_model.py::test_grep_category FAILED [ 43%]
tests/test_model.py::test_grep_not_category FAILED [ 45%]
tests/test_model.py::test_add_headers FAILED [ 47%]
tests/test_model.py::test_grep_topics FAILED [ 50%]
tests/test_model.py::test_grep_not_topics FAILED [ 52%]
tests/test_model.py::test_grep_start_end_validation PASSED [ 54%]
tests/test_model.py::test_grep_start_end FAILED [ 56%]
tests/test_model.py::test_grep_msg_id FAILED [ 58%]
tests/test_model.py::test_grep_users FAILED [ 60%]
tests/test_model.py::test_grep_not_users FAILED [ 62%]
tests/test_model.py::test_grep_packages FAILED [ 64%]
tests/test_model.py::test_grep_not_packages FAILED [ 66%]
tests/test_model.py::test_grep_contains FAILED [ 68%]
tests/test_model.py::test_grep_rows_per_page_none FAILED [ 70%]
tests/test_model.py::test_grep_rows_per_page_zero FAILED [ 72%]
tests/test_model.py::test_grep_defer FAILED [ 75%]
tests/test_model.py::test_add_duplicate FAILED [ 77%]
tests/test_model.py::test_add_integrity_error PASSED [ 79%]
tests/test_model.py::test_add_duplicate_package FAILED [ 81%]
tests/test_model.py::test_add_message_with_error_on_packages FAILED [ 83%]
tests/test_model.py::test_as_fedora_message_dict FAILED [ 85%]
tests/test_model.py::test_as_fedora_message_dict_old_headers FAILED [ 87%]
tests/test_model.py::test_as_fedora_message_dict_no_headers FAILED [ 89%]
tests/test_model.py::test_as_dict FAILED [ 91%]
tests/test_model.py::test_as_dict_with_users_and_packages FAILED [ 93%]
tests/test_model.py::test___json__deprecated FAILED [ 95%]
tests/test_model.py::test_singleton_create PASSED [ 97%]
tests/test_model.py::test_singleton_get_existing PASSED [100%]

=================================== FAILURES ===================================
__________________________ test_unclassified_category __________________________

self = dialect = constructor = >
statement =
parameters = [{'category': 'Unclassified', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:46:00+00:00'}, ...}]
execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 1]}})
args = (, [{'category': 'Unclassified...erity': 20, 'sent-at': '2023-05-09T02:46:00+00:00'}, ...}], , [])
kw = {'cache_hit': symbol('CACHE_MISS')}
branched =
yp = None
conn = def _execute_context( self, dialect, constructor, statement, parameters, execution_options, *args, **kw ): """Create an :class:`.ExecutionContext` and execute, returning a :class:`_engine.CursorResult`.""" branched = self if self.__branch_from: # if this is a "branched" connection, do everything in terms # of the "root" connection, *except* for .close(), which is # the only feature that branching provides self = self.__branch_from if execution_options: yp = execution_options.get("yield_per", None) if yp: execution_options = execution_options.union( {"stream_results": True, "max_row_buffer": yp} ) try: conn = self._dbapi_connection if conn is None: conn = self._revalidate_connection() > context = constructor( dialect, self, conn, execution_options, *args, **kw ) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ cls = dialect = connection = dbapi_connection = execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 1]}}) compiled = parameters = [{'category': 'Unclassified', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:46:00+00:00'}, ...}] invoked_statement = extracted_parameters = [], cache_hit = symbol('CACHE_MISS') @classmethod def _init_compiled( cls, dialect, connection, dbapi_connection, execution_options, compiled, parameters, invoked_statement, extracted_parameters, cache_hit=CACHING_DISABLED, ): """Initialize execution context for a Compiled construct.""" self = cls.__new__(cls) self.root_connection = connection self._dbapi_connection = dbapi_connection self.dialect = connection.dialect self.extracted_parameters = extracted_parameters self.invoked_statement = invoked_statement self.compiled = compiled self.cache_hit = cache_hit self.execution_options = execution_options self._is_future_result = ( connection._is_future or 
self.execution_options.get("future_result", False) ) self.result_column_struct = ( compiled._result_columns, compiled._ordered_columns, compiled._textual_ordered_columns, compiled._ad_hoc_textual, compiled._loose_column_name_matching, ) self.isinsert = compiled.isinsert self.isupdate = compiled.isupdate self.isdelete = compiled.isdelete self.is_text = compiled.isplaintext if self.isinsert or self.isupdate or self.isdelete: self.is_crud = True self._is_explicit_returning = bool(compiled.statement._returning) self._is_implicit_returning = bool( compiled.returning and not compiled.statement._returning ) if not parameters: self.compiled_parameters = [ compiled.construct_params( extracted_parameters=extracted_parameters, escape_names=False, ) ] else: self.compiled_parameters = [ compiled.construct_params( m, escape_names=False, _group_number=grp, extracted_parameters=extracted_parameters, ) for grp, m in enumerate(parameters) ] self.executemany = len(parameters) > 1 # this must occur before create_cursor() since the statement # has to be regexed in some cases for server side cursor if util.py2k: self.unicode_statement = util.text_type(compiled.string) else: self.unicode_statement = compiled.string self.cursor = self.create_cursor() if self.compiled.insert_prefetch or self.compiled.update_prefetch: if self.executemany: self._process_executemany_defaults() else: > self._process_executesingle_defaults() /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _process_executesingle_defaults(self): key_getter = self.compiled._within_exec_param_key_getter self.current_parameters = ( compiled_parameters ) = self.compiled_parameters[0] for c in self.compiled.insert_prefetch: if c.default and not c.default.is_sequence and c.default.is_scalar: val = c.default.arg else: > val = self.get_insert_default(c) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: _ 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.primary_key and column is column.table._autoincrement_column: if column.server_default and column.server_default.has_argument: # pre-execute passive defaults on primary key columns return self._execute_scalar( "select %s" % column.server_default.arg, column.type ) elif column.default is None or ( column.default.is_sequence and column.default.optional ): # execute the sequence associated with a SERIAL primary # key column. for non-primary-key SERIAL, the ID just # generates server side. try: seq_name = column._postgresql_seq_name except AttributeError: tab = column.table.name col = column.name tab = tab[0 : 29 + max(0, (29 - len(col)))] col = col[0 : 29 + max(0, (29 - len(tab)))] name = "%s_%s_seq" % (tab, col) column._postgresql_seq_name = seq_name = name if column.table is not None: effective_schema = self.connection.schema_for_object( column.table ) else: effective_schema = None if effective_schema is not None: exc = 'select nextval(\'"%s"."%s"\')' % ( effective_schema, seq_name, ) else: exc = "select nextval('\"%s\"')" % (seq_name,) return self._execute_scalar(exc, column.type) > return super(PGExecutionContext, self).get_insert_default(column) /usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.default is None: return None else: > return self._exec_default(column, column.default, column.type) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, 
default=ColumnDefault()) default = ColumnDefault() type_ = Unicode() def _exec_default(self, column, default, type_): if default.is_sequence: return self.fire_sequence(default, type_) elif default.is_callable: self.current_column = column > return default.arg(self) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ context = def source_version_default(context): > dist = pkg_resources.get_distribution("datanommer.models") ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ dist = Requirement.parse('datanommer.models') def get_distribution(dist): """Return a current distribution object for a Requirement or string""" if isinstance(dist, str): dist = Requirement.parse(dist) if isinstance(dist, Requirement): > dist = get_provider(dist) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ moduleOrReq = Requirement.parse('datanommer.models') def get_provider(moduleOrReq): """Return an IResourceProvider for the named module or requirement""" if isinstance(moduleOrReq, Requirement): > return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = requirements = ('datanommer.models',) def require(self, *requirements): """Ensure that distributions matching `requirements` are activated `requirements` must be a string or a (possibly-nested) sequence thereof, specifying the distributions and versions required. 
The return value is a sequence of the distributions that needed to be activated to fulfill the requirements; all relevant distributions are included, even if they were already activated in this working set. """ > needed = self.resolve(parse_requirements(requirements)) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , requirements = [] env = , installer = None replace_conflicting = False, extras = None def resolve(self, requirements, env=None, installer=None, # noqa: C901 replace_conflicting=False, extras=None): """List all distributions needed to (recursively) meet `requirements` `requirements` must be a sequence of ``Requirement`` objects. `env`, if supplied, should be an ``Environment`` instance. If not supplied, it defaults to all distributions available within any entry or distribution in the working set. `installer`, if supplied, will be invoked with each requirement that cannot be met by an already-installed distribution; it should return a ``Distribution`` or ``None``. Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. 
required_by = collections.defaultdict(set) while requirements: # process dependencies breadth-first req = requirements.pop(0) if req in processed: # Ignore cyclic or redundant dependencies continue if not req_extras.markers_pass(req, extras): continue dist = best.get(req.key) if dist is None: # Find the best distribution and add it to the map dist = self.by_key.get(req.key) if dist is None or (dist not in req and replace_conflicting): ws = self if env is None: if dist is None: env = Environment(self.entries) else: # Use an empty environment and workingset to avoid # any further conflicts with the conflicting # distribution env = Environment([]) ws = WorkingSet([]) dist = best[req.key] = env.best_match( req, ws, installer, replace_conflicting=replace_conflicting ) if dist is None: requirers = required_by.get(req, None) > raise DistributionNotFound(req, requirers) E pkg_resources.DistributionNotFound: The 'datanommer.models' distribution was not found and is required by the application /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: DistributionNotFound The above exception was the direct cause of the following exception: datanommer_models = def test_unclassified_category(datanommer_models): example_message = generate_message(topic="too.short") > add(example_message) tests/test_model.py:97: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:147: in add Message.create( ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:275: in create session.flush() :2: in flush ??? 
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3444: in flush self._flush(objects) /usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3583: in _flush with util.safe_reraise(): /usr/lib64/python3.11/site-packages/sqlalchemy/util/langhelpers.py:70: in __exit__ compat.raise_( /usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_ raise exception /usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3544: in _flush flush_context.execute() /usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:456: in execute rec.execute(self) /usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:630: in execute util.preloaded.orm_persistence.save_obj( /usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:245: in save_obj _emit_insert_statements( /usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:1238: in _emit_insert_statements result = connection._execute_20( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1705: in _execute_20 return meth(self, args_10style, kwargs_10style, execution_options) /usr/lib64/python3.11/site-packages/sqlalchemy/sql/elements.py:334: in _execute_on_connection return connection._execute_clauseelement( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1572: in _execute_clauseelement ret = self._execute_context( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1806: in _execute_context self._handle_dbapi_exception( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:2124: in _handle_dbapi_exception util.raise_( /usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_ raise exception /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: in _execute_context context = constructor( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: in _init_compiled self._process_executesingle_defaults() /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: in 
_process_executesingle_defaults val = self.get_insert_default(c) /usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: in get_insert_default return super(PGExecutionContext, self).get_insert_default(column) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: in get_insert_default return self._exec_default(column, column.default, column.type) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: in _exec_default return default.arg(self) ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: in source_version_default dist = pkg_resources.get_distribution("datanommer.models") /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: in get_distribution dist = get_provider(dist) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: in get_provider return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: in require needed = self.resolve(parse_requirements(requirements)) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , requirements = [] env = , installer = None replace_conflicting = False, extras = None def resolve(self, requirements, env=None, installer=None, # noqa: C901 replace_conflicting=False, extras=None): """List all distributions needed to (recursively) meet `requirements` `requirements` must be a sequence of ``Requirement`` objects. `env`, if supplied, should be an ``Environment`` instance. If not supplied, it defaults to all distributions available within any entry or distribution in the working set. `installer`, if supplied, will be invoked with each requirement that cannot be met by an already-installed distribution; it should return a ``Distribution`` or ``None``. 
Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. required_by = collections.defaultdict(set) while requirements: # process dependencies breadth-first req = requirements.pop(0) if req in processed: # Ignore cyclic or redundant dependencies continue if not req_extras.markers_pass(req, extras): continue dist = best.get(req.key) if dist is None: # Find the best distribution and add it to the map dist = self.by_key.get(req.key) if dist is None or (dist not in req and replace_conflicting): ws = self if env is None: if dist is None: env = Environment(self.entries) else: # Use an empty environment and workingset to avoid # any further conflicts with the conflicting # distribution env = Environment([]) ws = WorkingSet([]) dist = best[req.key] = env.best_match( req, ws, installer, replace_conflicting=replace_conflicting ) if dist is None: requirers = required_by.get(req, None) > raise DistributionNotFound(req, requirers) E sqlalchemy.exc.StatementError: (pkg_resources.DistributionNotFound) The 'datanommer.models' distribution was not found and is required by the application E [SQL: INSERT INTO messages (msg_id, i, topic, timestamp, certificate, 
signature, category, username, crypto, source_name, source_version, msg, headers) VALUES (%(msg_id)s, %(i)s, %(topic)s, %(timestamp)s, %(certificate)s, %(signature)s, %(category)s, %(username)s, %(crypto)s, %(source_name)s, %(source_version)s, %(msg)s, %(headers)s) RETURNING messages.id]
E [parameters: [{'topic': 'too.short', 'timestamp': datetime.datetime(2023, 5, 9, 2, 46, tzinfo=datetime.timezone.utc), 'i': 0, 'headers': {'fedora_messaging_schema' ... (143 characters truncated) ... 8-44f3-ab36-7ae820e8c20f', 'msg': {'encouragement': "You're doing great!"}, 'username': None, 'crypto': None, 'certificate': None, 'signature': None}]]
/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: StatementError
----------------------------- Captured stderr call -----------------------------
Traceback (most recent call last):
  File "/builddir/build/BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py", line 258, in get_category
    self.category = topic.split(".")[index]
                    ~~~~~~~~~~~~~~~~^^^^^^^
IndexError: list index out of range
_______________________________ test_from_msg_id _______________________________

self =  dialect =  constructor = >
statement =  parameters = [{'category': 'a', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:46:15+00:00'}, ...}]
execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 2]}})
args = (, [{'category': 'a', 'certifi...erity': 20, 'sent-at': '2023-05-09T02:46:15+00:00'}, ...}], , [])
kw = {'cache_hit': symbol('CACHE_MISS')}
branched =  yp = None conn =

    def _execute_context(
        self, dialect, constructor, statement, parameters, execution_options, *args, **kw
    ):
        """Create an :class:`.ExecutionContext` and execute, returning
        a :class:`_engine.CursorResult`."""

        branched = self
        if self.__branch_from:
            # if this is a "branched" connection, do everything in terms
# of the "root" connection, *except* for .close(), which is # the only feature that branching provides self = self.__branch_from if execution_options: yp = execution_options.get("yield_per", None) if yp: execution_options = execution_options.union( {"stream_results": True, "max_row_buffer": yp} ) try: conn = self._dbapi_connection if conn is None: conn = self._revalidate_connection() > context = constructor( dialect, self, conn, execution_options, *args, **kw ) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ cls = dialect = connection = dbapi_connection = execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 2]}}) compiled = parameters = [{'category': 'a', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:46:15+00:00'}, ...}] invoked_statement = extracted_parameters = [], cache_hit = symbol('CACHE_MISS') @classmethod def _init_compiled( cls, dialect, connection, dbapi_connection, execution_options, compiled, parameters, invoked_statement, extracted_parameters, cache_hit=CACHING_DISABLED, ): """Initialize execution context for a Compiled construct.""" self = cls.__new__(cls) self.root_connection = connection self._dbapi_connection = dbapi_connection self.dialect = connection.dialect self.extracted_parameters = extracted_parameters self.invoked_statement = invoked_statement self.compiled = compiled self.cache_hit = cache_hit self.execution_options = execution_options self._is_future_result = ( connection._is_future or self.execution_options.get("future_result", False) ) self.result_column_struct = ( compiled._result_columns, compiled._ordered_columns, compiled._textual_ordered_columns, compiled._ad_hoc_textual, compiled._loose_column_name_matching, ) self.isinsert = compiled.isinsert self.isupdate = compiled.isupdate self.isdelete = 
compiled.isdelete self.is_text = compiled.isplaintext if self.isinsert or self.isupdate or self.isdelete: self.is_crud = True self._is_explicit_returning = bool(compiled.statement._returning) self._is_implicit_returning = bool( compiled.returning and not compiled.statement._returning ) if not parameters: self.compiled_parameters = [ compiled.construct_params( extracted_parameters=extracted_parameters, escape_names=False, ) ] else: self.compiled_parameters = [ compiled.construct_params( m, escape_names=False, _group_number=grp, extracted_parameters=extracted_parameters, ) for grp, m in enumerate(parameters) ] self.executemany = len(parameters) > 1 # this must occur before create_cursor() since the statement # has to be regexed in some cases for server side cursor if util.py2k: self.unicode_statement = util.text_type(compiled.string) else: self.unicode_statement = compiled.string self.cursor = self.create_cursor() if self.compiled.insert_prefetch or self.compiled.update_prefetch: if self.executemany: self._process_executemany_defaults() else: > self._process_executesingle_defaults() /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _process_executesingle_defaults(self): key_getter = self.compiled._within_exec_param_key_getter self.current_parameters = ( compiled_parameters ) = self.compiled_parameters[0] for c in self.compiled.insert_prefetch: if c.default and not c.default.is_sequence and c.default.is_scalar: val = c.default.arg else: > val = self.get_insert_default(c) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.primary_key and column is column.table._autoincrement_column: if column.server_default and 
column.server_default.has_argument: # pre-execute passive defaults on primary key columns return self._execute_scalar( "select %s" % column.server_default.arg, column.type ) elif column.default is None or ( column.default.is_sequence and column.default.optional ): # execute the sequence associated with a SERIAL primary # key column. for non-primary-key SERIAL, the ID just # generates server side. try: seq_name = column._postgresql_seq_name except AttributeError: tab = column.table.name col = column.name tab = tab[0 : 29 + max(0, (29 - len(col)))] col = col[0 : 29 + max(0, (29 - len(tab)))] name = "%s_%s_seq" % (tab, col) column._postgresql_seq_name = seq_name = name if column.table is not None: effective_schema = self.connection.schema_for_object( column.table ) else: effective_schema = None if effective_schema is not None: exc = 'select nextval(\'"%s"."%s"\')' % ( effective_schema, seq_name, ) else: exc = "select nextval('\"%s\"')" % (seq_name,) return self._execute_scalar(exc, column.type) > return super(PGExecutionContext, self).get_insert_default(column) /usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.default is None: return None else: > return self._exec_default(column, column.default, column.type) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) default = ColumnDefault() type_ = Unicode() def _exec_default(self, column, default, type_): if default.is_sequence: return self.fire_sequence(default, type_) elif default.is_callable: self.current_column = column > return default.arg(self) 
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

context =

    def source_version_default(context):
>       dist = pkg_resources.get_distribution("datanommer.models")

../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

dist = Requirement.parse('datanommer.models')

    def get_distribution(dist):
        """Return a current distribution object for a Requirement or string"""
        if isinstance(dist, str):
            dist = Requirement.parse(dist)
        if isinstance(dist, Requirement):
>           dist = get_provider(dist)

/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

moduleOrReq = Requirement.parse('datanommer.models')

    def get_provider(moduleOrReq):
        """Return an IResourceProvider for the named module or requirement"""
        if isinstance(moduleOrReq, Requirement):
>           return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0]

/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =  requirements = ('datanommer.models',)

    def require(self, *requirements):
        """Ensure that distributions matching `requirements` are activated

        `requirements` must be a string or a (possibly-nested) sequence
        thereof, specifying the distributions and versions required.  The
        return value is a sequence of the distributions that needed to be
        activated to fulfill the requirements; all relevant distributions are
        included, even if they were already activated in this working set.
""" > needed = self.resolve(parse_requirements(requirements)) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , requirements = [] env = , installer = None replace_conflicting = False, extras = None def resolve(self, requirements, env=None, installer=None, # noqa: C901 replace_conflicting=False, extras=None): """List all distributions needed to (recursively) meet `requirements` `requirements` must be a sequence of ``Requirement`` objects. `env`, if supplied, should be an ``Environment`` instance. If not supplied, it defaults to all distributions available within any entry or distribution in the working set. `installer`, if supplied, will be invoked with each requirement that cannot be met by an already-installed distribution; it should return a ``Distribution`` or ``None``. Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. 
        required_by = collections.defaultdict(set)

        while requirements:
            # process dependencies breadth-first
            req = requirements.pop(0)

            if req in processed:
                # Ignore cyclic or redundant dependencies
                continue

            if not req_extras.markers_pass(req, extras):
                continue

            dist = best.get(req.key)
            if dist is None:
                # Find the best distribution and add it to the map
                dist = self.by_key.get(req.key)
                if dist is None or (dist not in req and replace_conflicting):
                    ws = self
                    if env is None:
                        if dist is None:
                            env = Environment(self.entries)
                        else:
                            # Use an empty environment and workingset to avoid
                            # any further conflicts with the conflicting
                            # distribution
                            env = Environment([])
                            ws = WorkingSet([])

                    dist = best[req.key] = env.best_match(
                        req, ws, installer, replace_conflicting=replace_conflicting
                    )
                    if dist is None:
                        requirers = required_by.get(req, None)
>                       raise DistributionNotFound(req, requirers)
E                       pkg_resources.DistributionNotFound: The 'datanommer.models' distribution was not found and is required by the application

/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: DistributionNotFound

The above exception was the direct cause of the following exception:

datanommer_models =

    def test_from_msg_id(datanommer_models):
        example_message = generate_message()
        example_message.id = "ACUSTOMMESSAGEID"
>       add(example_message)

tests/test_model.py:106:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:147: in add
    Message.create(
../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:275: in create
    session.flush()
<string>:2: in flush
    ???
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3444: in flush self._flush(objects) /usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3583: in _flush with util.safe_reraise(): /usr/lib64/python3.11/site-packages/sqlalchemy/util/langhelpers.py:70: in __exit__ compat.raise_( /usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_ raise exception /usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3544: in _flush flush_context.execute() /usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:456: in execute rec.execute(self) /usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:630: in execute util.preloaded.orm_persistence.save_obj( /usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:245: in save_obj _emit_insert_statements( /usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:1238: in _emit_insert_statements result = connection._execute_20( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1705: in _execute_20 return meth(self, args_10style, kwargs_10style, execution_options) /usr/lib64/python3.11/site-packages/sqlalchemy/sql/elements.py:334: in _execute_on_connection return connection._execute_clauseelement( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1572: in _execute_clauseelement ret = self._execute_context( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1806: in _execute_context self._handle_dbapi_exception( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:2124: in _handle_dbapi_exception util.raise_( /usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_ raise exception /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: in _execute_context context = constructor( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: in _init_compiled self._process_executesingle_defaults() /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: in 
_process_executesingle_defaults val = self.get_insert_default(c) /usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: in get_insert_default return super(PGExecutionContext, self).get_insert_default(column) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: in get_insert_default return self._exec_default(column, column.default, column.type) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: in _exec_default return default.arg(self) ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: in source_version_default dist = pkg_resources.get_distribution("datanommer.models") /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: in get_distribution dist = get_provider(dist) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: in get_provider return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: in require needed = self.resolve(parse_requirements(requirements)) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , requirements = [] env = , installer = None replace_conflicting = False, extras = None def resolve(self, requirements, env=None, installer=None, # noqa: C901 replace_conflicting=False, extras=None): """List all distributions needed to (recursively) meet `requirements` `requirements` must be a sequence of ``Requirement`` objects. `env`, if supplied, should be an ``Environment`` instance. If not supplied, it defaults to all distributions available within any entry or distribution in the working set. `installer`, if supplied, will be invoked with each requirement that cannot be met by an already-installed distribution; it should return a ``Distribution`` or ``None``. 
Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. required_by = collections.defaultdict(set) while requirements: # process dependencies breadth-first req = requirements.pop(0) if req in processed: # Ignore cyclic or redundant dependencies continue if not req_extras.markers_pass(req, extras): continue dist = best.get(req.key) if dist is None: # Find the best distribution and add it to the map dist = self.by_key.get(req.key) if dist is None or (dist not in req and replace_conflicting): ws = self if env is None: if dist is None: env = Environment(self.entries) else: # Use an empty environment and workingset to avoid # any further conflicts with the conflicting # distribution env = Environment([]) ws = WorkingSet([]) dist = best[req.key] = env.best_match( req, ws, installer, replace_conflicting=replace_conflicting ) if dist is None: requirers = required_by.get(req, None) > raise DistributionNotFound(req, requirers) E sqlalchemy.exc.StatementError: (pkg_resources.DistributionNotFound) The 'datanommer.models' distribution was not found and is required by the application E [SQL: INSERT INTO messages (msg_id, i, topic, timestamp, certificate, 
signature, category, username, crypto, source_name, source_version, msg, headers) VALUES (%(msg_id)s, %(i)s, %(topic)s, %(timestamp)s, %(certificate)s, %(signature)s, %(category)s, %(username)s, %(crypto)s, %(source_name)s, %(source_version)s, %(msg)s, %(headers)s) RETURNING messages.id]
E [parameters: [{'topic': 'org.fedoraproject.test.a.nice.message', 'timestamp': datetime.datetime(2023, 5, 9, 2, 46, 15, tzinfo=datetime.timezone.utc), 'i': 0, 'head ... (144 characters truncated) ... g_id': 'ACUSTOMMESSAGEID', 'msg': {'encouragement': "You're doing great!"}, 'username': None, 'crypto': None, 'certificate': None, 'signature': None}]]
/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: StatementError
___________________________ test_add_missing_msg_id ____________________________

self =  dialect =  constructor = >
statement =  parameters = [{'category': 'a', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:46:28+00:00'}, ...}]
execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 3]}})
args = (, [{'category': 'a', 'certifi...erity': 20, 'sent-at': '2023-05-09T02:46:28+00:00'}, ...}], , [])
kw = {'cache_hit': symbol('CACHE_MISS')}
branched =  yp = None conn =

    def _execute_context(
        self, dialect, constructor, statement, parameters, execution_options, *args, **kw
    ):
        """Create an :class:`.ExecutionContext` and execute, returning
        a :class:`_engine.CursorResult`."""

        branched = self
        if self.__branch_from:
            # if this is a "branched" connection, do everything in terms
            # of the "root" connection, *except* for .close(), which is
            # the only feature that branching provides
            self = self.__branch_from

        if execution_options:
            yp = execution_options.get("yield_per", None)
            if yp:
                execution_options = execution_options.union(
                    {"stream_results": True, "max_row_buffer": yp}
                )

        try:
            conn = self._dbapi_connection
            if conn is None:
                conn =
self._revalidate_connection() > context = constructor( dialect, self, conn, execution_options, *args, **kw ) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ cls = dialect = connection = dbapi_connection = execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 3]}}) compiled = parameters = [{'category': 'a', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:46:28+00:00'}, ...}] invoked_statement = extracted_parameters = [], cache_hit = symbol('CACHE_MISS') @classmethod def _init_compiled( cls, dialect, connection, dbapi_connection, execution_options, compiled, parameters, invoked_statement, extracted_parameters, cache_hit=CACHING_DISABLED, ): """Initialize execution context for a Compiled construct.""" self = cls.__new__(cls) self.root_connection = connection self._dbapi_connection = dbapi_connection self.dialect = connection.dialect self.extracted_parameters = extracted_parameters self.invoked_statement = invoked_statement self.compiled = compiled self.cache_hit = cache_hit self.execution_options = execution_options self._is_future_result = ( connection._is_future or self.execution_options.get("future_result", False) ) self.result_column_struct = ( compiled._result_columns, compiled._ordered_columns, compiled._textual_ordered_columns, compiled._ad_hoc_textual, compiled._loose_column_name_matching, ) self.isinsert = compiled.isinsert self.isupdate = compiled.isupdate self.isdelete = compiled.isdelete self.is_text = compiled.isplaintext if self.isinsert or self.isupdate or self.isdelete: self.is_crud = True self._is_explicit_returning = bool(compiled.statement._returning) self._is_implicit_returning = bool( compiled.returning and not compiled.statement._returning ) if not parameters: self.compiled_parameters = [ compiled.construct_params( 
extracted_parameters=extracted_parameters, escape_names=False, ) ] else: self.compiled_parameters = [ compiled.construct_params( m, escape_names=False, _group_number=grp, extracted_parameters=extracted_parameters, ) for grp, m in enumerate(parameters) ] self.executemany = len(parameters) > 1 # this must occur before create_cursor() since the statement # has to be regexed in some cases for server side cursor if util.py2k: self.unicode_statement = util.text_type(compiled.string) else: self.unicode_statement = compiled.string self.cursor = self.create_cursor() if self.compiled.insert_prefetch or self.compiled.update_prefetch: if self.executemany: self._process_executemany_defaults() else: > self._process_executesingle_defaults() /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _process_executesingle_defaults(self): key_getter = self.compiled._within_exec_param_key_getter self.current_parameters = ( compiled_parameters ) = self.compiled_parameters[0] for c in self.compiled.insert_prefetch: if c.default and not c.default.is_sequence and c.default.is_scalar: val = c.default.arg else: > val = self.get_insert_default(c) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.primary_key and column is column.table._autoincrement_column: if column.server_default and column.server_default.has_argument: # pre-execute passive defaults on primary key columns return self._execute_scalar( "select %s" % column.server_default.arg, column.type ) elif column.default is None or ( column.default.is_sequence and column.default.optional ): # execute the sequence associated with a SERIAL primary # key column. 
for non-primary-key SERIAL, the ID just # generates server side. try: seq_name = column._postgresql_seq_name except AttributeError: tab = column.table.name col = column.name tab = tab[0 : 29 + max(0, (29 - len(col)))] col = col[0 : 29 + max(0, (29 - len(tab)))] name = "%s_%s_seq" % (tab, col) column._postgresql_seq_name = seq_name = name if column.table is not None: effective_schema = self.connection.schema_for_object( column.table ) else: effective_schema = None if effective_schema is not None: exc = 'select nextval(\'"%s"."%s"\')' % ( effective_schema, seq_name, ) else: exc = "select nextval('\"%s\"')" % (seq_name,) return self._execute_scalar(exc, column.type) > return super(PGExecutionContext, self).get_insert_default(column) /usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.default is None: return None else: > return self._exec_default(column, column.default, column.type) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) default = ColumnDefault() type_ = Unicode() def _exec_default(self, column, default, type_): if default.is_sequence: return self.fire_sequence(default, type_) elif default.is_callable: self.current_column = column > return default.arg(self) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ context = def source_version_default(context): > dist = pkg_resources.get_distribution("datanommer.models") ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ dist = Requirement.parse('datanommer.models') def get_distribution(dist): """Return a current distribution object for a Requirement or string""" if isinstance(dist, str): dist = Requirement.parse(dist) if isinstance(dist, Requirement): > dist = get_provider(dist) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ moduleOrReq = Requirement.parse('datanommer.models') def get_provider(moduleOrReq): """Return an IResourceProvider for the named module or requirement""" if isinstance(moduleOrReq, Requirement): > return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = requirements = ('datanommer.models',) def require(self, *requirements): """Ensure that distributions matching `requirements` are activated `requirements` must be a string or a (possibly-nested) sequence thereof, specifying the distributions and versions required. The return value is a sequence of the distributions that needed to be activated to fulfill the requirements; all relevant distributions are included, even if they were already activated in this working set. """ > needed = self.resolve(parse_requirements(requirements)) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , requirements = [] env = , installer = None replace_conflicting = False, extras = None def resolve(self, requirements, env=None, installer=None, # noqa: C901 replace_conflicting=False, extras=None): """List all distributions needed to (recursively) meet `requirements` `requirements` must be a sequence of ``Requirement`` objects. `env`, if supplied, should be an ``Environment`` instance. 
If not supplied, it defaults to all distributions available within any entry or distribution in the working set. `installer`, if supplied, will be invoked with each requirement that cannot be met by an already-installed distribution; it should return a ``Distribution`` or ``None``. Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. 
        required_by = collections.defaultdict(set)

        while requirements:
            # process dependencies breadth-first
            req = requirements.pop(0)

            if req in processed:
                # Ignore cyclic or redundant dependencies
                continue

            if not req_extras.markers_pass(req, extras):
                continue

            dist = best.get(req.key)
            if dist is None:
                # Find the best distribution and add it to the map
                dist = self.by_key.get(req.key)
                if dist is None or (dist not in req and replace_conflicting):
                    ws = self
                    if env is None:
                        if dist is None:
                            env = Environment(self.entries)
                        else:
                            # Use an empty environment and workingset to avoid
                            # any further conflicts with the conflicting
                            # distribution
                            env = Environment([])
                            ws = WorkingSet([])

                    dist = best[req.key] = env.best_match(
                        req, ws, installer, replace_conflicting=replace_conflicting
                    )
                    if dist is None:
                        requirers = required_by.get(req, None)
>                       raise DistributionNotFound(req, requirers)
E                       pkg_resources.DistributionNotFound: The 'datanommer.models' distribution was not found and is required by the application

/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: DistributionNotFound

The above exception was the direct cause of the following exception:

datanommer_models =  caplog = <_pytest.logging.LogCaptureFixture object at 0xffffff98217190>

    def test_add_missing_msg_id(datanommer_models, caplog):
        caplog.set_level(logging.INFO)
        example_message = generate_message()
        example_message._properties.message_id = None
>       add(example_message)

tests/test_model.py:116:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:147: in add
    Message.create(
../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:275: in create
    session.flush()
<string>:2: in flush
    ???
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3444: in flush self._flush(objects) /usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3583: in _flush with util.safe_reraise(): /usr/lib64/python3.11/site-packages/sqlalchemy/util/langhelpers.py:70: in __exit__ compat.raise_( /usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_ raise exception /usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3544: in _flush flush_context.execute() /usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:456: in execute rec.execute(self) /usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:630: in execute util.preloaded.orm_persistence.save_obj( /usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:245: in save_obj _emit_insert_statements( /usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:1238: in _emit_insert_statements result = connection._execute_20( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1705: in _execute_20 return meth(self, args_10style, kwargs_10style, execution_options) /usr/lib64/python3.11/site-packages/sqlalchemy/sql/elements.py:334: in _execute_on_connection return connection._execute_clauseelement( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1572: in _execute_clauseelement ret = self._execute_context( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1806: in _execute_context self._handle_dbapi_exception( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:2124: in _handle_dbapi_exception util.raise_( /usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_ raise exception /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: in _execute_context context = constructor( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: in _init_compiled self._process_executesingle_defaults() /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: in 
_process_executesingle_defaults val = self.get_insert_default(c) /usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: in get_insert_default return super(PGExecutionContext, self).get_insert_default(column) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: in get_insert_default return self._exec_default(column, column.default, column.type) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: in _exec_default return default.arg(self) ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: in source_version_default dist = pkg_resources.get_distribution("datanommer.models") /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: in get_distribution dist = get_provider(dist) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: in get_provider return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: in require needed = self.resolve(parse_requirements(requirements)) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , requirements = [] env = , installer = None replace_conflicting = False, extras = None def resolve(self, requirements, env=None, installer=None, # noqa: C901 replace_conflicting=False, extras=None): """List all distributions needed to (recursively) meet `requirements` `requirements` must be a sequence of ``Requirement`` objects. `env`, if supplied, should be an ``Environment`` instance. If not supplied, it defaults to all distributions available within any entry or distribution in the working set. `installer`, if supplied, will be invoked with each requirement that cannot be met by an already-installed distribution; it should return a ``Distribution`` or ``None``. 
Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. required_by = collections.defaultdict(set) while requirements: # process dependencies breadth-first req = requirements.pop(0) if req in processed: # Ignore cyclic or redundant dependencies continue if not req_extras.markers_pass(req, extras): continue dist = best.get(req.key) if dist is None: # Find the best distribution and add it to the map dist = self.by_key.get(req.key) if dist is None or (dist not in req and replace_conflicting): ws = self if env is None: if dist is None: env = Environment(self.entries) else: # Use an empty environment and workingset to avoid # any further conflicts with the conflicting # distribution env = Environment([]) ws = WorkingSet([]) dist = best[req.key] = env.best_match( req, ws, installer, replace_conflicting=replace_conflicting ) if dist is None: requirers = required_by.get(req, None) > raise DistributionNotFound(req, requirers) E sqlalchemy.exc.StatementError: (pkg_resources.DistributionNotFound) The 'datanommer.models' distribution was not found and is required by the application E [SQL: INSERT INTO messages (msg_id, i, topic, timestamp, certificate, 
signature, category, username, crypto, source_name, source_version, msg, headers) VALUES (%(msg_id)s, %(i)s, %(topic)s, %(timestamp)s, %(certificate)s, %(signature)s, %(category)s, %(username)s, %(crypto)s, %(source_name)s, %(source_version)s, %(msg)s, %(headers)s) RETURNING messages.id] E [parameters: [{'topic': 'org.fedoraproject.test.a.nice.message', 'timestamp': datetime.datetime(2023, 5, 9, 2, 46, 28, tzinfo=datetime.timezone.utc), 'i': 0, 'head ... (164 characters truncated) ... 6-4b4a-8ec4-096fadfad51c', 'msg': {'encouragement': "You're doing great!"}, 'username': None, 'crypto': None, 'certificate': None, 'signature': None}]] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: StatementError ------------------------------ Captured log call ------------------------------- INFO datanommer:__init__.py:269 Message on org.fedoraproject.test.a.nice.message was received without a msg_id __________________________ test_add_missing_timestamp __________________________ self = dialect = constructor = > statement = parameters = [{'category': 'a', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': None}, ...}] execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 4]}}) args = (, [{'category': 'a', 'certifi..., 'fedora_messaging_severity': 20, 'sent-at': None}, ...}], , []) kw = {'cache_hit': symbol('CACHE_MISS')} branched = yp = None conn = def _execute_context( self, dialect, constructor, statement, parameters, execution_options, *args, **kw ): """Create an :class:`.ExecutionContext` and execute, returning a :class:`_engine.CursorResult`.""" branched = self if self.__branch_from: # if this is a "branched" connection, do everything in terms # of the "root" connection, *except* for .close(), which is # the only feature that branching provides self = self.__branch_from if execution_options: yp = execution_options.get("yield_per", None) if yp: 
execution_options = execution_options.union( {"stream_results": True, "max_row_buffer": yp} ) try: conn = self._dbapi_connection if conn is None: conn = self._revalidate_connection() > context = constructor( dialect, self, conn, execution_options, *args, **kw ) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ cls = dialect = connection = dbapi_connection = execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 4]}}) compiled = parameters = [{'category': 'a', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': None}, ...}] invoked_statement = extracted_parameters = [], cache_hit = symbol('CACHE_MISS') @classmethod def _init_compiled( cls, dialect, connection, dbapi_connection, execution_options, compiled, parameters, invoked_statement, extracted_parameters, cache_hit=CACHING_DISABLED, ): """Initialize execution context for a Compiled construct.""" self = cls.__new__(cls) self.root_connection = connection self._dbapi_connection = dbapi_connection self.dialect = connection.dialect self.extracted_parameters = extracted_parameters self.invoked_statement = invoked_statement self.compiled = compiled self.cache_hit = cache_hit self.execution_options = execution_options self._is_future_result = ( connection._is_future or self.execution_options.get("future_result", False) ) self.result_column_struct = ( compiled._result_columns, compiled._ordered_columns, compiled._textual_ordered_columns, compiled._ad_hoc_textual, compiled._loose_column_name_matching, ) self.isinsert = compiled.isinsert self.isupdate = compiled.isupdate self.isdelete = compiled.isdelete self.is_text = compiled.isplaintext if self.isinsert or self.isupdate or self.isdelete: self.is_crud = True self._is_explicit_returning = bool(compiled.statement._returning) self._is_implicit_returning = bool( 
compiled.returning and not compiled.statement._returning ) if not parameters: self.compiled_parameters = [ compiled.construct_params( extracted_parameters=extracted_parameters, escape_names=False, ) ] else: self.compiled_parameters = [ compiled.construct_params( m, escape_names=False, _group_number=grp, extracted_parameters=extracted_parameters, ) for grp, m in enumerate(parameters) ] self.executemany = len(parameters) > 1 # this must occur before create_cursor() since the statement # has to be regexed in some cases for server side cursor if util.py2k: self.unicode_statement = util.text_type(compiled.string) else: self.unicode_statement = compiled.string self.cursor = self.create_cursor() if self.compiled.insert_prefetch or self.compiled.update_prefetch: if self.executemany: self._process_executemany_defaults() else: > self._process_executesingle_defaults() /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _process_executesingle_defaults(self): key_getter = self.compiled._within_exec_param_key_getter self.current_parameters = ( compiled_parameters ) = self.compiled_parameters[0] for c in self.compiled.insert_prefetch: if c.default and not c.default.is_sequence and c.default.is_scalar: val = c.default.arg else: > val = self.get_insert_default(c) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.primary_key and column is column.table._autoincrement_column: if column.server_default and column.server_default.has_argument: # pre-execute passive defaults on primary key columns return self._execute_scalar( "select %s" % column.server_default.arg, column.type ) elif column.default is None or ( column.default.is_sequence and 
column.default.optional ): # execute the sequence associated with a SERIAL primary # key column. for non-primary-key SERIAL, the ID just # generates server side. try: seq_name = column._postgresql_seq_name except AttributeError: tab = column.table.name col = column.name tab = tab[0 : 29 + max(0, (29 - len(col)))] col = col[0 : 29 + max(0, (29 - len(tab)))] name = "%s_%s_seq" % (tab, col) column._postgresql_seq_name = seq_name = name if column.table is not None: effective_schema = self.connection.schema_for_object( column.table ) else: effective_schema = None if effective_schema is not None: exc = 'select nextval(\'"%s"."%s"\')' % ( effective_schema, seq_name, ) else: exc = "select nextval('\"%s\"')" % (seq_name,) return self._execute_scalar(exc, column.type) > return super(PGExecutionContext, self).get_insert_default(column) /usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.default is None: return None else: > return self._exec_default(column, column.default, column.type) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) default = ColumnDefault() type_ = Unicode() def _exec_default(self, column, default, type_): if default.is_sequence: return self.fire_sequence(default, type_) elif default.is_callable: self.current_column = column > return default.arg(self) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ context = def source_version_default(context): > dist = pkg_resources.get_distribution("datanommer.models") 
../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ dist = Requirement.parse('datanommer.models') def get_distribution(dist): """Return a current distribution object for a Requirement or string""" if isinstance(dist, str): dist = Requirement.parse(dist) if isinstance(dist, Requirement): > dist = get_provider(dist) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ moduleOrReq = Requirement.parse('datanommer.models') def get_provider(moduleOrReq): """Return an IResourceProvider for the named module or requirement""" if isinstance(moduleOrReq, Requirement): > return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = requirements = ('datanommer.models',) def require(self, *requirements): """Ensure that distributions matching `requirements` are activated `requirements` must be a string or a (possibly-nested) sequence thereof, specifying the distributions and versions required. The return value is a sequence of the distributions that needed to be activated to fulfill the requirements; all relevant distributions are included, even if they were already activated in this working set. 
""" > needed = self.resolve(parse_requirements(requirements)) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , requirements = [] env = , installer = None replace_conflicting = False, extras = None def resolve(self, requirements, env=None, installer=None, # noqa: C901 replace_conflicting=False, extras=None): """List all distributions needed to (recursively) meet `requirements` `requirements` must be a sequence of ``Requirement`` objects. `env`, if supplied, should be an ``Environment`` instance. If not supplied, it defaults to all distributions available within any entry or distribution in the working set. `installer`, if supplied, will be invoked with each requirement that cannot be met by an already-installed distribution; it should return a ``Distribution`` or ``None``. Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. 
required_by = collections.defaultdict(set) while requirements: # process dependencies breadth-first req = requirements.pop(0) if req in processed: # Ignore cyclic or redundant dependencies continue if not req_extras.markers_pass(req, extras): continue dist = best.get(req.key) if dist is None: # Find the best distribution and add it to the map dist = self.by_key.get(req.key) if dist is None or (dist not in req and replace_conflicting): ws = self if env is None: if dist is None: env = Environment(self.entries) else: # Use an empty environment and workingset to avoid # any further conflicts with the conflicting # distribution env = Environment([]) ws = WorkingSet([]) dist = best[req.key] = env.best_match( req, ws, installer, replace_conflicting=replace_conflicting ) if dist is None: requirers = required_by.get(req, None) > raise DistributionNotFound(req, requirers) E pkg_resources.DistributionNotFound: The 'datanommer.models' distribution was not found and is required by the application /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: DistributionNotFound The above exception was the direct cause of the following exception: datanommer_models = def test_add_missing_timestamp(datanommer_models): example_message = generate_message() example_message._properties.headers["sent-at"] = None > add(example_message) tests/test_model.py:129: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:147: in add Message.create( ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:275: in create session.flush() :2: in flush ??? 
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3444: in flush self._flush(objects) /usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3583: in _flush with util.safe_reraise(): /usr/lib64/python3.11/site-packages/sqlalchemy/util/langhelpers.py:70: in __exit__ compat.raise_( /usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_ raise exception /usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3544: in _flush flush_context.execute() /usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:456: in execute rec.execute(self) /usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:630: in execute util.preloaded.orm_persistence.save_obj( /usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:245: in save_obj _emit_insert_statements( /usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:1238: in _emit_insert_statements result = connection._execute_20( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1705: in _execute_20 return meth(self, args_10style, kwargs_10style, execution_options) /usr/lib64/python3.11/site-packages/sqlalchemy/sql/elements.py:334: in _execute_on_connection return connection._execute_clauseelement( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1572: in _execute_clauseelement ret = self._execute_context( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1806: in _execute_context self._handle_dbapi_exception( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:2124: in _handle_dbapi_exception util.raise_( /usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_ raise exception /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: in _execute_context context = constructor( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: in _init_compiled self._process_executesingle_defaults() /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: in 
_process_executesingle_defaults val = self.get_insert_default(c) /usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: in get_insert_default return super(PGExecutionContext, self).get_insert_default(column) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: in get_insert_default return self._exec_default(column, column.default, column.type) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: in _exec_default return default.arg(self) ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: in source_version_default dist = pkg_resources.get_distribution("datanommer.models") /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: in get_distribution dist = get_provider(dist) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: in get_provider return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: in require needed = self.resolve(parse_requirements(requirements)) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , requirements = [] env = , installer = None replace_conflicting = False, extras = None def resolve(self, requirements, env=None, installer=None, # noqa: C901 replace_conflicting=False, extras=None): """List all distributions needed to (recursively) meet `requirements` `requirements` must be a sequence of ``Requirement`` objects. `env`, if supplied, should be an ``Environment`` instance. If not supplied, it defaults to all distributions available within any entry or distribution in the working set. `installer`, if supplied, will be invoked with each requirement that cannot be met by an already-installed distribution; it should return a ``Distribution`` or ``None``. 
Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. required_by = collections.defaultdict(set) while requirements: # process dependencies breadth-first req = requirements.pop(0) if req in processed: # Ignore cyclic or redundant dependencies continue if not req_extras.markers_pass(req, extras): continue dist = best.get(req.key) if dist is None: # Find the best distribution and add it to the map dist = self.by_key.get(req.key) if dist is None or (dist not in req and replace_conflicting): ws = self if env is None: if dist is None: env = Environment(self.entries) else: # Use an empty environment and workingset to avoid # any further conflicts with the conflicting # distribution env = Environment([]) ws = WorkingSet([]) dist = best[req.key] = env.best_match( req, ws, installer, replace_conflicting=replace_conflicting ) if dist is None: requirers = required_by.get(req, None) > raise DistributionNotFound(req, requirers) E sqlalchemy.exc.StatementError: (pkg_resources.DistributionNotFound) The 'datanommer.models' distribution was not found and is required by the application E [SQL: INSERT INTO messages (msg_id, i, topic, timestamp, certificate, 
signature, category, username, crypto, source_name, source_version, msg, headers) VALUES (%(msg_id)s, %(i)s, %(topic)s, %(timestamp)s, %(certificate)s, %(signature)s, %(category)s, %(username)s, %(crypto)s, %(source_name)s, %(source_version)s, %(msg)s, %(headers)s) RETURNING messages.id] E [parameters: [{'topic': 'org.fedoraproject.test.a.nice.message', 'timestamp': datetime.datetime(2023, 5, 9, 2, 46, 41, 298427), 'i': 0, 'headers': {'fedora_messagi ... (119 characters truncated) ... 6-4c49-b2d0-a6b82c249c5c', 'msg': {'encouragement': "You're doing great!"}, 'username': None, 'crypto': None, 'certificate': None, 'signature': None}]] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: StatementError __________________________ test_add_timestamp_with_Z ___________________________ self = dialect = constructor = > statement = parameters = [{'category': 'a', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2021-07-27T04:22:42Z'}, ...}] execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 5]}}) args = (, [{'category': 'a', 'certifi...g_severity': 20, 'sent-at': '2021-07-27T04:22:42Z'}, ...}], , []) kw = {'cache_hit': symbol('CACHE_MISS')} branched = yp = None conn = def _execute_context( self, dialect, constructor, statement, parameters, execution_options, *args, **kw ): """Create an :class:`.ExecutionContext` and execute, returning a :class:`_engine.CursorResult`.""" branched = self if self.__branch_from: # if this is a "branched" connection, do everything in terms # of the "root" connection, *except* for .close(), which is # the only feature that branching provides self = self.__branch_from if execution_options: yp = execution_options.get("yield_per", None) if yp: execution_options = execution_options.union( {"stream_results": True, "max_row_buffer": yp} ) try: conn = self._dbapi_connection if conn is None: conn = self._revalidate_connection() > 
context = constructor( dialect, self, conn, execution_options, *args, **kw ) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ cls = dialect = connection = dbapi_connection = execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 5]}}) compiled = parameters = [{'category': 'a', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2021-07-27T04:22:42Z'}, ...}] invoked_statement = extracted_parameters = [], cache_hit = symbol('CACHE_MISS') @classmethod def _init_compiled( cls, dialect, connection, dbapi_connection, execution_options, compiled, parameters, invoked_statement, extracted_parameters, cache_hit=CACHING_DISABLED, ): """Initialize execution context for a Compiled construct.""" self = cls.__new__(cls) self.root_connection = connection self._dbapi_connection = dbapi_connection self.dialect = connection.dialect self.extracted_parameters = extracted_parameters self.invoked_statement = invoked_statement self.compiled = compiled self.cache_hit = cache_hit self.execution_options = execution_options self._is_future_result = ( connection._is_future or self.execution_options.get("future_result", False) ) self.result_column_struct = ( compiled._result_columns, compiled._ordered_columns, compiled._textual_ordered_columns, compiled._ad_hoc_textual, compiled._loose_column_name_matching, ) self.isinsert = compiled.isinsert self.isupdate = compiled.isupdate self.isdelete = compiled.isdelete self.is_text = compiled.isplaintext if self.isinsert or self.isupdate or self.isdelete: self.is_crud = True self._is_explicit_returning = bool(compiled.statement._returning) self._is_implicit_returning = bool( compiled.returning and not compiled.statement._returning ) if not parameters: self.compiled_parameters = [ compiled.construct_params( extracted_parameters=extracted_parameters, 
escape_names=False, ) ] else: self.compiled_parameters = [ compiled.construct_params( m, escape_names=False, _group_number=grp, extracted_parameters=extracted_parameters, ) for grp, m in enumerate(parameters) ] self.executemany = len(parameters) > 1 # this must occur before create_cursor() since the statement # has to be regexed in some cases for server side cursor if util.py2k: self.unicode_statement = util.text_type(compiled.string) else: self.unicode_statement = compiled.string self.cursor = self.create_cursor() if self.compiled.insert_prefetch or self.compiled.update_prefetch: if self.executemany: self._process_executemany_defaults() else: > self._process_executesingle_defaults() /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _process_executesingle_defaults(self): key_getter = self.compiled._within_exec_param_key_getter self.current_parameters = ( compiled_parameters ) = self.compiled_parameters[0] for c in self.compiled.insert_prefetch: if c.default and not c.default.is_sequence and c.default.is_scalar: val = c.default.arg else: > val = self.get_insert_default(c) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.primary_key and column is column.table._autoincrement_column: if column.server_default and column.server_default.has_argument: # pre-execute passive defaults on primary key columns return self._execute_scalar( "select %s" % column.server_default.arg, column.type ) elif column.default is None or ( column.default.is_sequence and column.default.optional ): # execute the sequence associated with a SERIAL primary # key column. for non-primary-key SERIAL, the ID just # generates server side. 
                try:
                    seq_name = column._postgresql_seq_name
                except AttributeError:
                    tab = column.table.name
                    col = column.name
                    tab = tab[0 : 29 + max(0, (29 - len(col)))]
                    col = col[0 : 29 + max(0, (29 - len(tab)))]
                    name = "%s_%s_seq" % (tab, col)
                    column._postgresql_seq_name = seq_name = name

                if column.table is not None:
                    effective_schema = self.connection.schema_for_object(
                        column.table
                    )
                else:
                    effective_schema = None

                if effective_schema is not None:
                    exc = 'select nextval(\'"%s"."%s"\')' % (
                        effective_schema,
                        seq_name,
                    )
                else:
                    exc = "select nextval('\"%s\"')" % (seq_name,)

                return self._execute_scalar(exc, column.type)

>       return super(PGExecutionContext, self).get_insert_default(column)

/usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
column = Column('source_version', Unicode(), table=, default=ColumnDefault())

    def get_insert_default(self, column):
        if column.default is None:
            return None
        else:
>           return self._exec_default(column, column.default, column.type)

/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
column = Column('source_version', Unicode(), table=, default=ColumnDefault())
default = ColumnDefault()
type_ = Unicode()

    def _exec_default(self, column, default, type_):
        if default.is_sequence:
            return self.fire_sequence(default, type_)
        elif default.is_callable:
            self.current_column = column
>           return default.arg(self)

/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

context =

    def source_version_default(context):
>       dist = pkg_resources.get_distribution("datanommer.models")

../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

dist = Requirement.parse('datanommer.models')

    def get_distribution(dist):
        """Return a current distribution object for a Requirement or string"""
        if isinstance(dist, str):
            dist = Requirement.parse(dist)
        if isinstance(dist, Requirement):
>           dist = get_provider(dist)

/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

moduleOrReq = Requirement.parse('datanommer.models')

    def get_provider(moduleOrReq):
        """Return an IResourceProvider for the named module or requirement"""
        if isinstance(moduleOrReq, Requirement):
>           return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0]

/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
requirements = ('datanommer.models',)

    def require(self, *requirements):
        """Ensure that distributions matching `requirements` are activated

        `requirements` must be a string or a (possibly-nested) sequence
        thereof, specifying the distributions and versions required.  The
        return value is a sequence of the distributions that needed to be
        activated to fulfill the requirements; all relevant distributions are
        included, even if they were already activated in this working set.
        """
>       needed = self.resolve(parse_requirements(requirements))

/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = , requirements = []
env = , installer = None
replace_conflicting = False, extras = None

    def resolve(self, requirements, env=None, installer=None,  # noqa: C901
                replace_conflicting=False, extras=None):
        """List all distributions needed to (recursively) meet `requirements`

        `requirements` must be a sequence of ``Requirement`` objects.  `env`,
        if supplied, should be an ``Environment`` instance.  If
        not supplied, it defaults to all distributions available within any
        entry or distribution in the working set.  `installer`, if supplied,
        will be invoked with each requirement that cannot be met by an
        already-installed distribution; it should return a ``Distribution`` or
        ``None``.

        Unless `replace_conflicting=True`, raises a VersionConflict exception
        if any requirements are found on the path that have the correct name
        but the wrong version.  Otherwise, if an `installer` is supplied it
        will be invoked to obtain the correct version of the requirement and
        activate it.

        `extras` is a list of the extras to be used with these requirements.
        This is important because extra requirements may look like `my_req;
        extra = "my_extra"`, which would otherwise be interpreted as a purely
        optional requirement.  Instead, we want to be able to assert that these
        requirements are truly required.
        """

        # set up the stack
        requirements = list(requirements)[::-1]
        # set of processed requirements
        processed = {}
        # key -> dist
        best = {}
        to_activate = []

        req_extras = _ReqExtras()

        # Mapping of requirement to set of distributions that required it;
        # useful for reporting info about conflicts.
        required_by = collections.defaultdict(set)

        while requirements:
            # process dependencies breadth-first
            req = requirements.pop(0)
            if req in processed:
                # Ignore cyclic or redundant dependencies
                continue

            if not req_extras.markers_pass(req, extras):
                continue

            dist = best.get(req.key)
            if dist is None:
                # Find the best distribution and add it to the map
                dist = self.by_key.get(req.key)
                if dist is None or (dist not in req and replace_conflicting):
                    ws = self
                    if env is None:
                        if dist is None:
                            env = Environment(self.entries)
                        else:
                            # Use an empty environment and workingset to avoid
                            # any further conflicts with the conflicting
                            # distribution
                            env = Environment([])
                            ws = WorkingSet([])
                    dist = best[req.key] = env.best_match(
                        req, ws, installer,
                        replace_conflicting=replace_conflicting
                    )
                    if dist is None:
                        requirers = required_by.get(req, None)
>                       raise DistributionNotFound(req, requirers)
E                       pkg_resources.DistributionNotFound: The 'datanommer.models' distribution was not found and is required by the application

/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: DistributionNotFound

The above exception was the direct cause of the following exception:

datanommer_models =

    def test_add_timestamp_with_Z(datanommer_models):
        example_message = generate_message()
        example_message._properties.headers["sent-at"] = "2021-07-27T04:22:42Z"

>       add(example_message)

tests/test_model.py:142:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:147: in add
    Message.create(
../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:275: in create
    session.flush()
:2: in flush
    ???
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3444: in flush
    self._flush(objects)
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3583: in _flush
    with util.safe_reraise():
/usr/lib64/python3.11/site-packages/sqlalchemy/util/langhelpers.py:70: in __exit__
    compat.raise_(
/usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_
    raise exception
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3544: in _flush
    flush_context.execute()
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:456: in execute
    rec.execute(self)
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:630: in execute
    util.preloaded.orm_persistence.save_obj(
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:245: in save_obj
    _emit_insert_statements(
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:1238: in _emit_insert_statements
    result = connection._execute_20(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1705: in _execute_20
    return meth(self, args_10style, kwargs_10style, execution_options)
/usr/lib64/python3.11/site-packages/sqlalchemy/sql/elements.py:334: in _execute_on_connection
    return connection._execute_clauseelement(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1572: in _execute_clauseelement
    ret = self._execute_context(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1806: in _execute_context
    self._handle_dbapi_exception(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:2124: in _handle_dbapi_exception
    util.raise_(
/usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_
    raise exception
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: in _execute_context
    context = constructor(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: in _init_compiled
    self._process_executesingle_defaults()
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: in _process_executesingle_defaults
    val = self.get_insert_default(c)
/usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: in get_insert_default
    return super(PGExecutionContext, self).get_insert_default(column)
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: in get_insert_default
    return self._exec_default(column, column.default, column.type)
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: in _exec_default
    return default.arg(self)
../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: in source_version_default
    dist = pkg_resources.get_distribution("datanommer.models")
/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: in get_distribution
    dist = get_provider(dist)
/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: in get_provider
    return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0]
/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: in require
    needed = self.resolve(parse_requirements(requirements))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = , requirements = []
env = , installer = None
replace_conflicting = False, extras = None

    def resolve(self, requirements, env=None, installer=None,  # noqa: C901
                replace_conflicting=False, extras=None):
        """List all distributions needed to (recursively) meet `requirements`

        `requirements` must be a sequence of ``Requirement`` objects.  `env`,
        if supplied, should be an ``Environment`` instance.  If
        not supplied, it defaults to all distributions available within any
        entry or distribution in the working set.  `installer`, if supplied,
        will be invoked with each requirement that cannot be met by an
        already-installed distribution; it should return a ``Distribution`` or
        ``None``.

        Unless `replace_conflicting=True`, raises a VersionConflict exception
        if any requirements are found on the path that have the correct name
        but the wrong version.  Otherwise, if an `installer` is supplied it
        will be invoked to obtain the correct version of the requirement and
        activate it.

        `extras` is a list of the extras to be used with these requirements.
        This is important because extra requirements may look like `my_req;
        extra = "my_extra"`, which would otherwise be interpreted as a purely
        optional requirement.  Instead, we want to be able to assert that these
        requirements are truly required.
        """

        # set up the stack
        requirements = list(requirements)[::-1]
        # set of processed requirements
        processed = {}
        # key -> dist
        best = {}
        to_activate = []

        req_extras = _ReqExtras()

        # Mapping of requirement to set of distributions that required it;
        # useful for reporting info about conflicts.
        required_by = collections.defaultdict(set)

        while requirements:
            # process dependencies breadth-first
            req = requirements.pop(0)
            if req in processed:
                # Ignore cyclic or redundant dependencies
                continue

            if not req_extras.markers_pass(req, extras):
                continue

            dist = best.get(req.key)
            if dist is None:
                # Find the best distribution and add it to the map
                dist = self.by_key.get(req.key)
                if dist is None or (dist not in req and replace_conflicting):
                    ws = self
                    if env is None:
                        if dist is None:
                            env = Environment(self.entries)
                        else:
                            # Use an empty environment and workingset to avoid
                            # any further conflicts with the conflicting
                            # distribution
                            env = Environment([])
                            ws = WorkingSet([])
                    dist = best[req.key] = env.best_match(
                        req, ws, installer,
                        replace_conflicting=replace_conflicting
                    )
                    if dist is None:
                        requirers = required_by.get(req, None)
>                       raise DistributionNotFound(req, requirers)
E                       sqlalchemy.exc.StatementError: (pkg_resources.DistributionNotFound) The 'datanommer.models' distribution was not found and is required by the application
E                       [SQL: INSERT INTO messages (msg_id, i, topic, timestamp, certificate, signature, category, username, crypto, source_name, source_version, msg, headers) VALUES (%(msg_id)s, %(i)s, %(topic)s, %(timestamp)s, %(certificate)s, %(signature)s, %(category)s, %(username)s, %(crypto)s, %(source_name)s, %(source_version)s, %(msg)s, %(headers)s) RETURNING messages.id]
E                       [parameters: [{'topic': 'org.fedoraproject.test.a.nice.message', 'timestamp': datetime.datetime(2021, 7, 27, 4, 22, 42, tzinfo=datetime.timezone.utc), 'i': 0, 'hea ... (160 characters truncated) ... d-4583-97ca-24748106ce2c', 'msg': {'encouragement': "You're doing great!"}, 'username': None, 'crypto': None, 'certificate': None, 'signature': None}]]

/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: StatementError
________________________ test_add_and_check_for_others _________________________

self = , dialect = , constructor =
statement = , parameters = [{'category': 'bodhi', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_rpm_abrt-addon-python3': Tru...g_rpm_kernel': True, 'fedora_messaging_schema': 'bodhi.update.comment.v1', 'fedora_messaging_severity': 20, ...}, ...}]
execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 6]}})
args = (, [{'category': 'bodhi', 'cer....comment.v1', 'fedora_messaging_severity': 20, ...}, ...}], , [])
kw = {'cache_hit': symbol('CACHE_MISS')}
branched = , yp = None
conn =

    def _execute_context(
        self, dialect, constructor, statement, parameters, execution_options, *args, **kw
    ):
        """Create an :class:`.ExecutionContext` and execute, returning a
        :class:`_engine.CursorResult`."""

        branched = self
        if self.__branch_from:
            # if this is a "branched" connection, do everything in terms
            # of the "root" connection, *except* for .close(), which is
            # the only feature that branching provides
            self = self.__branch_from

        if execution_options:
            yp = execution_options.get("yield_per", None)
            if yp:
                execution_options = execution_options.union(
                    {"stream_results": True, "max_row_buffer": yp}
                )

        try:
            conn = self._dbapi_connection
            if conn is None:
                conn = self._revalidate_connection()

>           context = constructor(
                dialect, self, conn, execution_options, *args, **kw
            )

/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

cls = , dialect = , connection = , dbapi_connection =
execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 6]}})
compiled = , parameters = [{'category': 'bodhi', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_rpm_abrt-addon-python3': Tru...g_rpm_kernel': True, 'fedora_messaging_schema': 'bodhi.update.comment.v1', 'fedora_messaging_severity': 20, ...}, ...}]
invoked_statement = , extracted_parameters = [], cache_hit = symbol('CACHE_MISS')

    @classmethod
    def _init_compiled(
        cls,
        dialect,
        connection,
        dbapi_connection,
        execution_options,
        compiled,
        parameters,
        invoked_statement,
        extracted_parameters,
        cache_hit=CACHING_DISABLED,
    ):
        """Initialize execution context for a Compiled construct."""

        self = cls.__new__(cls)
        self.root_connection = connection
        self._dbapi_connection = dbapi_connection
        self.dialect = connection.dialect
        self.extracted_parameters = extracted_parameters
        self.invoked_statement = invoked_statement
        self.compiled = compiled
        self.cache_hit = cache_hit

        self.execution_options = execution_options

        self._is_future_result = (
            connection._is_future
            or self.execution_options.get("future_result", False)
        )

        self.result_column_struct = (
            compiled._result_columns,
            compiled._ordered_columns,
            compiled._textual_ordered_columns,
            compiled._ad_hoc_textual,
            compiled._loose_column_name_matching,
        )

        self.isinsert = compiled.isinsert
        self.isupdate = compiled.isupdate
        self.isdelete = compiled.isdelete
        self.is_text = compiled.isplaintext

        if self.isinsert or self.isupdate or self.isdelete:
            self.is_crud = True
            self._is_explicit_returning = bool(compiled.statement._returning)
            self._is_implicit_returning = bool(
                compiled.returning and not compiled.statement._returning
            )

        if not parameters:
            self.compiled_parameters = [
                compiled.construct_params(
                    extracted_parameters=extracted_parameters,
                    escape_names=False,
                )
            ]
        else:
            self.compiled_parameters = [
                compiled.construct_params(
                    m,
                    escape_names=False,
                    _group_number=grp,
                    extracted_parameters=extracted_parameters,
                )
                for grp, m in enumerate(parameters)
            ]

            self.executemany = len(parameters) > 1

        # this must occur before create_cursor() since the statement
        # has to be regexed in some cases for server side cursor
        if util.py2k:
            self.unicode_statement = util.text_type(compiled.string)
        else:
            self.unicode_statement = compiled.string

        self.cursor = self.create_cursor()

        if self.compiled.insert_prefetch or self.compiled.update_prefetch:
            if self.executemany:
                self._process_executemany_defaults()
            else:
>               self._process_executesingle_defaults()

/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =

    def _process_executesingle_defaults(self):
        key_getter = self.compiled._within_exec_param_key_getter
        self.current_parameters = (
            compiled_parameters
        ) = self.compiled_parameters[0]

        for c in self.compiled.insert_prefetch:
            if c.default and not c.default.is_sequence and c.default.is_scalar:
                val = c.default.arg
            else:
>               val = self.get_insert_default(c)

/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
column = Column('source_version', Unicode(), table=, default=ColumnDefault())

    def get_insert_default(self, column):
        if column.primary_key and column is column.table._autoincrement_column:
            if column.server_default and column.server_default.has_argument:
                # pre-execute passive defaults on primary key columns
                return self._execute_scalar(
                    "select %s" % column.server_default.arg, column.type
                )
            elif column.default is None or (
                column.default.is_sequence and column.default.optional
            ):
                # execute the sequence associated with a SERIAL primary
                # key column. for non-primary-key SERIAL, the ID just
                # generates server side.
                try:
                    seq_name = column._postgresql_seq_name
                except AttributeError:
                    tab = column.table.name
                    col = column.name
                    tab = tab[0 : 29 + max(0, (29 - len(col)))]
                    col = col[0 : 29 + max(0, (29 - len(tab)))]
                    name = "%s_%s_seq" % (tab, col)
                    column._postgresql_seq_name = seq_name = name

                if column.table is not None:
                    effective_schema = self.connection.schema_for_object(
                        column.table
                    )
                else:
                    effective_schema = None

                if effective_schema is not None:
                    exc = 'select nextval(\'"%s"."%s"\')' % (
                        effective_schema,
                        seq_name,
                    )
                else:
                    exc = "select nextval('\"%s\"')" % (seq_name,)

                return self._execute_scalar(exc, column.type)

>       return super(PGExecutionContext, self).get_insert_default(column)

/usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
column = Column('source_version', Unicode(), table=, default=ColumnDefault())

    def get_insert_default(self, column):
        if column.default is None:
            return None
        else:
>           return self._exec_default(column, column.default, column.type)

/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
column = Column('source_version', Unicode(), table=, default=ColumnDefault())
default = ColumnDefault()
type_ = Unicode()

    def _exec_default(self, column, default, type_):
        if default.is_sequence:
            return self.fire_sequence(default, type_)
        elif default.is_callable:
            self.current_column = column
>           return default.arg(self)

/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

context =

    def source_version_default(context):
>       dist = pkg_resources.get_distribution("datanommer.models")

../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

dist = Requirement.parse('datanommer.models')

    def get_distribution(dist):
        """Return a current distribution object for a Requirement or string"""
        if isinstance(dist, str):
            dist = Requirement.parse(dist)
        if isinstance(dist, Requirement):
>           dist = get_provider(dist)

/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

moduleOrReq = Requirement.parse('datanommer.models')

    def get_provider(moduleOrReq):
        """Return an IResourceProvider for the named module or requirement"""
        if isinstance(moduleOrReq, Requirement):
>           return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0]

/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
requirements = ('datanommer.models',)

    def require(self, *requirements):
        """Ensure that distributions matching `requirements` are activated

        `requirements` must be a string or a (possibly-nested) sequence
        thereof, specifying the distributions and versions required.  The
        return value is a sequence of the distributions that needed to be
        activated to fulfill the requirements; all relevant distributions are
        included, even if they were already activated in this working set.
        """
>       needed = self.resolve(parse_requirements(requirements))

/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = , requirements = []
env = , installer = None
replace_conflicting = False, extras = None

    def resolve(self, requirements, env=None, installer=None,  # noqa: C901
                replace_conflicting=False, extras=None):
        """List all distributions needed to (recursively) meet `requirements`

        `requirements` must be a sequence of ``Requirement`` objects.  `env`,
        if supplied, should be an ``Environment`` instance.  If
        not supplied, it defaults to all distributions available within any
        entry or distribution in the working set.  `installer`, if supplied,
        will be invoked with each requirement that cannot be met by an
        already-installed distribution; it should return a ``Distribution`` or
        ``None``.

        Unless `replace_conflicting=True`, raises a VersionConflict exception
        if any requirements are found on the path that have the correct name
        but the wrong version.  Otherwise, if an `installer` is supplied it
        will be invoked to obtain the correct version of the requirement and
        activate it.

        `extras` is a list of the extras to be used with these requirements.
        This is important because extra requirements may look like `my_req;
        extra = "my_extra"`, which would otherwise be interpreted as a purely
        optional requirement.  Instead, we want to be able to assert that these
        requirements are truly required.
        """

        # set up the stack
        requirements = list(requirements)[::-1]
        # set of processed requirements
        processed = {}
        # key -> dist
        best = {}
        to_activate = []

        req_extras = _ReqExtras()

        # Mapping of requirement to set of distributions that required it;
        # useful for reporting info about conflicts.
        required_by = collections.defaultdict(set)

        while requirements:
            # process dependencies breadth-first
            req = requirements.pop(0)
            if req in processed:
                # Ignore cyclic or redundant dependencies
                continue

            if not req_extras.markers_pass(req, extras):
                continue

            dist = best.get(req.key)
            if dist is None:
                # Find the best distribution and add it to the map
                dist = self.by_key.get(req.key)
                if dist is None or (dist not in req and replace_conflicting):
                    ws = self
                    if env is None:
                        if dist is None:
                            env = Environment(self.entries)
                        else:
                            # Use an empty environment and workingset to avoid
                            # any further conflicts with the conflicting
                            # distribution
                            env = Environment([])
                            ws = WorkingSet([])
                    dist = best[req.key] = env.best_match(
                        req, ws, installer,
                        replace_conflicting=replace_conflicting
                    )
                    if dist is None:
                        requirers = required_by.get(req, None)
>                       raise DistributionNotFound(req, requirers)
E                       pkg_resources.DistributionNotFound: The 'datanommer.models' distribution was not found and is required by the application

/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: DistributionNotFound

The above exception was the direct cause of the following exception:

datanommer_models =

    def test_add_and_check_for_others(datanommer_models):
        # There are no users or packages at the start
        assert User.query.count() == 0
        assert Package.query.count() == 0

        # Then add a message
>       add(generate_bodhi_update_complete_message())

tests/test_model.py:168:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:147: in add
    Message.create(
../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:275: in create
    session.flush()
:2: in flush
    ???
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3444: in flush
    self._flush(objects)
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3583: in _flush
    with util.safe_reraise():
/usr/lib64/python3.11/site-packages/sqlalchemy/util/langhelpers.py:70: in __exit__
    compat.raise_(
/usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_
    raise exception
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3544: in _flush
    flush_context.execute()
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:456: in execute
    rec.execute(self)
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:630: in execute
    util.preloaded.orm_persistence.save_obj(
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:245: in save_obj
    _emit_insert_statements(
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:1238: in _emit_insert_statements
    result = connection._execute_20(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1705: in _execute_20
    return meth(self, args_10style, kwargs_10style, execution_options)
/usr/lib64/python3.11/site-packages/sqlalchemy/sql/elements.py:334: in _execute_on_connection
    return connection._execute_clauseelement(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1572: in _execute_clauseelement
    ret = self._execute_context(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1806: in _execute_context
    self._handle_dbapi_exception(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:2124: in _handle_dbapi_exception
    util.raise_(
/usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_
    raise exception
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: in _execute_context
    context = constructor(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: in _init_compiled
    self._process_executesingle_defaults()
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: in _process_executesingle_defaults
    val = self.get_insert_default(c)
/usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: in get_insert_default
    return super(PGExecutionContext, self).get_insert_default(column)
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: in get_insert_default
    return self._exec_default(column, column.default, column.type)
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: in _exec_default
    return default.arg(self)
../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: in source_version_default
    dist = pkg_resources.get_distribution("datanommer.models")
/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: in get_distribution
    dist = get_provider(dist)
/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: in get_provider
    return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0]
/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: in require
    needed = self.resolve(parse_requirements(requirements))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = , requirements = []
env = , installer = None
replace_conflicting = False, extras = None

    def resolve(self, requirements, env=None, installer=None,  # noqa: C901
                replace_conflicting=False, extras=None):
        """List all distributions needed to (recursively) meet `requirements`

        `requirements` must be a sequence of ``Requirement`` objects.  `env`,
        if supplied, should be an ``Environment`` instance.  If
        not supplied, it defaults to all distributions available within any
        entry or distribution in the working set.  `installer`, if supplied,
        will be invoked with each requirement that cannot be met by an
        already-installed distribution; it should return a ``Distribution`` or
        ``None``.

        Unless `replace_conflicting=True`, raises a VersionConflict exception
        if any requirements are found on the path that have the correct name
        but the wrong version.  Otherwise, if an `installer` is supplied it
        will be invoked to obtain the correct version of the requirement and
        activate it.

        `extras` is a list of the extras to be used with these requirements.
        This is important because extra requirements may look like `my_req;
        extra = "my_extra"`, which would otherwise be interpreted as a purely
        optional requirement.  Instead, we want to be able to assert that these
        requirements are truly required.
        """

        # set up the stack
        requirements = list(requirements)[::-1]
        # set of processed requirements
        processed = {}
        # key -> dist
        best = {}
        to_activate = []

        req_extras = _ReqExtras()

        # Mapping of requirement to set of distributions that required it;
        # useful for reporting info about conflicts.
        required_by = collections.defaultdict(set)

        while requirements:
            # process dependencies breadth-first
            req = requirements.pop(0)
            if req in processed:
                # Ignore cyclic or redundant dependencies
                continue

            if not req_extras.markers_pass(req, extras):
                continue

            dist = best.get(req.key)
            if dist is None:
                # Find the best distribution and add it to the map
                dist = self.by_key.get(req.key)
                if dist is None or (dist not in req and replace_conflicting):
                    ws = self
                    if env is None:
                        if dist is None:
                            env = Environment(self.entries)
                        else:
                            # Use an empty environment and workingset to avoid
                            # any further conflicts with the conflicting
                            # distribution
                            env = Environment([])
                            ws = WorkingSet([])
                    dist = best[req.key] = env.best_match(
                        req, ws, installer,
                        replace_conflicting=replace_conflicting
                    )
                    if dist is None:
                        requirers = required_by.get(req, None)
>                       raise DistributionNotFound(req, requirers)
E                       sqlalchemy.exc.StatementError: (pkg_resources.DistributionNotFound) The 'datanommer.models' distribution was not found and is required by the application
E                       [SQL: INSERT INTO messages (msg_id, i, topic, timestamp, certificate, signature, category, username, crypto, source_name, source_version, msg, headers) VALUES (%(msg_id)s, %(i)s, %(topic)s, %(timestamp)s, %(certificate)s, %(signature)s, %(category)s, %(username)s, %(crypto)s, %(source_name)s, %(source_version)s, %(msg)s, %(headers)s) RETURNING messages.id]
E                       [parameters: [{'topic': 'org.fedoraproject.stg.bodhi.update.comment', 'timestamp': datetime.datetime(2023, 5, 9, 2, 47, 10, tzinfo=datetime.timezone.utc), 'i': 0, ... (679 characters truncated) ... esting', 'user': {'name': 'ryanlerch'}}, 'user': {'name': 'dudemcpants'}}}, 'username': None, 'crypto': None, 'certificate': None, 'signature': None}]]

/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: StatementError
______________________________ test_add_and_check ______________________________

self = , dialect = , constructor =
statement = , parameters = [{'category': 'a', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:47:27+00:00'}, ...}]
execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 7]}})
args = (, [{'category': 'a', 'certifi...erity': 20, 'sent-at': '2023-05-09T02:47:27+00:00'}, ...}], , [])
kw = {'cache_hit': symbol('CACHE_MISS')}
branched = , yp = None
conn =

    def _execute_context(
        self, dialect, constructor, statement, parameters, execution_options, *args, **kw
    ):
        """Create an :class:`.ExecutionContext` and execute, returning a
        :class:`_engine.CursorResult`."""

        branched = self
        if self.__branch_from:
            # if this is a "branched" connection, do everything in terms
            # of the "root" connection, *except* for .close(), which is
            # the only feature that branching provides
            self = self.__branch_from

        if execution_options:
            yp = execution_options.get("yield_per", None)
            if yp:
                execution_options = execution_options.union(
                    {"stream_results": True, "max_row_buffer": yp}
                )

        try:
            conn = self._dbapi_connection
            if conn is None:
                conn = self._revalidate_connection()

>           context = constructor(
                dialect, self, conn, execution_options, *args, **kw
            )

/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

cls = , dialect = , connection = , dbapi_connection =
execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 7]}})
compiled = , parameters = [{'category': 'a', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:47:27+00:00'}, ...}]
invoked_statement = , extracted_parameters = [], cache_hit = symbol('CACHE_MISS')

    @classmethod
    def _init_compiled(
        cls,
        dialect,
        connection,
        dbapi_connection,
        execution_options,
        compiled,
        parameters,
        invoked_statement,
        extracted_parameters,
        cache_hit=CACHING_DISABLED,
    ):
        """Initialize execution context for a Compiled construct."""

        self = cls.__new__(cls)
        self.root_connection = connection
        self._dbapi_connection = dbapi_connection
        self.dialect = connection.dialect
        self.extracted_parameters = extracted_parameters
        self.invoked_statement = invoked_statement
        self.compiled = compiled
        self.cache_hit = cache_hit

        self.execution_options = execution_options

        self._is_future_result = (
            connection._is_future
            or self.execution_options.get("future_result", False)
        )

        self.result_column_struct = (
            compiled._result_columns,
            compiled._ordered_columns,
            compiled._textual_ordered_columns,
            compiled._ad_hoc_textual,
            compiled._loose_column_name_matching,
        )

        self.isinsert = compiled.isinsert
        self.isupdate = compiled.isupdate
        self.isdelete = compiled.isdelete
        self.is_text = compiled.isplaintext

        if self.isinsert or self.isupdate or self.isdelete:
            self.is_crud = True
            self._is_explicit_returning = bool(compiled.statement._returning)
            self._is_implicit_returning = bool(
                compiled.returning and not compiled.statement._returning
            )

        if not parameters:
            self.compiled_parameters = [
                compiled.construct_params(
                    extracted_parameters=extracted_parameters,
                    escape_names=False,
                )
            ]
        else:
            self.compiled_parameters = [
                compiled.construct_params(
                    m,
                    escape_names=False,
                    _group_number=grp,
                    extracted_parameters=extracted_parameters,
                )
                for grp, m in enumerate(parameters)
            ]

            self.executemany = len(parameters) > 1

        # this must occur before create_cursor() since the statement
        # has to be regexed in some cases for server side cursor
        if util.py2k:
            self.unicode_statement = util.text_type(compiled.string)
        else:
            self.unicode_statement = compiled.string

        self.cursor = self.create_cursor()

        if self.compiled.insert_prefetch or self.compiled.update_prefetch:
            if self.executemany:
                self._process_executemany_defaults()
            else:
>               self._process_executesingle_defaults()

/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =

    def _process_executesingle_defaults(self):
        key_getter = self.compiled._within_exec_param_key_getter
        self.current_parameters = (
            compiled_parameters
        ) = self.compiled_parameters[0]

        for c in self.compiled.insert_prefetch:
            if c.default and not c.default.is_sequence and c.default.is_scalar:
                val = c.default.arg
            else:
>               val = self.get_insert_default(c)

/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
column = Column('source_version', Unicode(), table=, default=ColumnDefault())

    def get_insert_default(self, column):
        if column.primary_key and column is column.table._autoincrement_column:
            if column.server_default and column.server_default.has_argument:
                # pre-execute passive defaults on primary key columns
                return self._execute_scalar(
                    "select %s" % column.server_default.arg, column.type
                )
            elif column.default is None or (
                column.default.is_sequence and column.default.optional
            ):
                # execute the sequence associated with a SERIAL primary
                # key column. for non-primary-key SERIAL, the ID just
                # generates server side.
                try:
                    seq_name = column._postgresql_seq_name
                except AttributeError:
                    tab = column.table.name
                    col = column.name
                    tab = tab[0 : 29 + max(0, (29 - len(col)))]
                    col = col[0 : 29 + max(0, (29 - len(tab)))]
                    name = "%s_%s_seq" % (tab, col)
                    column._postgresql_seq_name = seq_name = name

                if column.table is not None:
                    effective_schema = self.connection.schema_for_object(
                        column.table
                    )
                else:
                    effective_schema = None

                if effective_schema is not None:
                    exc = 'select nextval(\'"%s"."%s"\')' % (
                        effective_schema,
                        seq_name,
                    )
                else:
                    exc = "select nextval('\"%s\"')" % (seq_name,)

                return self._execute_scalar(exc, column.type)

>       return super(PGExecutionContext, self).get_insert_default(column)

/usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
column = Column('source_version', Unicode(), table=, default=ColumnDefault())

    def get_insert_default(self, column):
        if column.default is None:
            return None
        else:
>           return self._exec_default(column, column.default, column.type)

/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
column = Column('source_version', Unicode(), table=, default=ColumnDefault())
default = ColumnDefault()
type_ = Unicode()

    def _exec_default(self, column, default, type_):
        if default.is_sequence:
            return self.fire_sequence(default, type_)
        elif default.is_callable:
            self.current_column = column
>           return default.arg(self)

/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

context =

    def source_version_default(context):
>       dist = pkg_resources.get_distribution("datanommer.models")

../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ dist = Requirement.parse('datanommer.models') def get_distribution(dist): """Return a current distribution object for a Requirement or string""" if isinstance(dist, str): dist = Requirement.parse(dist) if isinstance(dist, Requirement): > dist = get_provider(dist) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ moduleOrReq = Requirement.parse('datanommer.models') def get_provider(moduleOrReq): """Return an IResourceProvider for the named module or requirement""" if isinstance(moduleOrReq, Requirement): > return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = requirements = ('datanommer.models',) def require(self, *requirements): """Ensure that distributions matching `requirements` are activated `requirements` must be a string or a (possibly-nested) sequence thereof, specifying the distributions and versions required. The return value is a sequence of the distributions that needed to be activated to fulfill the requirements; all relevant distributions are included, even if they were already activated in this working set. """ > needed = self.resolve(parse_requirements(requirements)) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , requirements = [] env = , installer = None replace_conflicting = False, extras = None def resolve(self, requirements, env=None, installer=None, # noqa: C901 replace_conflicting=False, extras=None): """List all distributions needed to (recursively) meet `requirements` `requirements` must be a sequence of ``Requirement`` objects. `env`, if supplied, should be an ``Environment`` instance. 
If not supplied, it defaults to all distributions available within any entry or distribution in the working set. `installer`, if supplied, will be invoked with each requirement that cannot be met by an already-installed distribution; it should return a ``Distribution`` or ``None``. Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. 
required_by = collections.defaultdict(set) while requirements: # process dependencies breadth-first req = requirements.pop(0) if req in processed: # Ignore cyclic or redundant dependencies continue if not req_extras.markers_pass(req, extras): continue dist = best.get(req.key) if dist is None: # Find the best distribution and add it to the map dist = self.by_key.get(req.key) if dist is None or (dist not in req and replace_conflicting): ws = self if env is None: if dist is None: env = Environment(self.entries) else: # Use an empty environment and workingset to avoid # any further conflicts with the conflicting # distribution env = Environment([]) ws = WorkingSet([]) dist = best[req.key] = env.best_match( req, ws, installer, replace_conflicting=replace_conflicting ) if dist is None: requirers = required_by.get(req, None) > raise DistributionNotFound(req, requirers) E pkg_resources.DistributionNotFound: The 'datanommer.models' distribution was not found and is required by the application /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: DistributionNotFound The above exception was the direct cause of the following exception: datanommer_models = def test_add_and_check(datanommer_models): > add(generate_message()) tests/test_model.py:190: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:147: in add Message.create( ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:275: in create session.flush() :2: in flush ??? 
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3444: in flush
    self._flush(objects)
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3583: in _flush
    with util.safe_reraise():
/usr/lib64/python3.11/site-packages/sqlalchemy/util/langhelpers.py:70: in __exit__
    compat.raise_(
/usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_
    raise exception
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3544: in _flush
    flush_context.execute()
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:456: in execute
    rec.execute(self)
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:630: in execute
    util.preloaded.orm_persistence.save_obj(
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:245: in save_obj
    _emit_insert_statements(
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:1238: in _emit_insert_statements
    result = connection._execute_20(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1705: in _execute_20
    return meth(self, args_10style, kwargs_10style, execution_options)
/usr/lib64/python3.11/site-packages/sqlalchemy/sql/elements.py:334: in _execute_on_connection
    return connection._execute_clauseelement(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1572: in _execute_clauseelement
    ret = self._execute_context(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1806: in _execute_context
    self._handle_dbapi_exception(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:2124: in _handle_dbapi_exception
    util.raise_(
/usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_
    raise exception
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: in _execute_context
    context = constructor(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: in _init_compiled
    self._process_executesingle_defaults()
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: in _process_executesingle_defaults
    val = self.get_insert_default(c)
/usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: in get_insert_default
    return super(PGExecutionContext, self).get_insert_default(column)
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: in get_insert_default
    return self._exec_default(column, column.default, column.type)
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: in _exec_default
    return default.arg(self)
../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: in source_version_default
    dist = pkg_resources.get_distribution("datanommer.models")
/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: in get_distribution
    dist = get_provider(dist)
/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: in get_provider
    return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0]
/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: in require
    needed = self.resolve(parse_requirements(requirements))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = , requirements = []
env = , installer = None
replace_conflicting = False, extras = None

    def resolve(self, requirements, env=None, installer=None,  # noqa: C901
                replace_conflicting=False, extras=None):
        """List all distributions needed to (recursively) meet `requirements`

        `requirements` must be a sequence of ``Requirement`` objects.  `env`,
        if supplied, should be an ``Environment`` instance.  If
        not supplied, it defaults to all distributions available within any
        entry or distribution in the working set.  `installer`, if supplied,
        will be invoked with each requirement that cannot be met by an
        already-installed distribution; it should return a ``Distribution`` or
        ``None``.
Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. required_by = collections.defaultdict(set) while requirements: # process dependencies breadth-first req = requirements.pop(0) if req in processed: # Ignore cyclic or redundant dependencies continue if not req_extras.markers_pass(req, extras): continue dist = best.get(req.key) if dist is None: # Find the best distribution and add it to the map dist = self.by_key.get(req.key) if dist is None or (dist not in req and replace_conflicting): ws = self if env is None: if dist is None: env = Environment(self.entries) else: # Use an empty environment and workingset to avoid # any further conflicts with the conflicting # distribution env = Environment([]) ws = WorkingSet([]) dist = best[req.key] = env.best_match( req, ws, installer, replace_conflicting=replace_conflicting ) if dist is None: requirers = required_by.get(req, None) > raise DistributionNotFound(req, requirers) E sqlalchemy.exc.StatementError: (pkg_resources.DistributionNotFound) The 'datanommer.models' distribution was not found and is required by the application E [SQL: INSERT INTO messages (msg_id, i, topic, timestamp, certificate, 
signature, category, username, crypto, source_name, source_version, msg, headers) VALUES (%(msg_id)s, %(i)s, %(topic)s, %(timestamp)s, %(certificate)s, %(signature)s, %(category)s, %(username)s, %(crypto)s, %(source_name)s, %(source_version)s, %(msg)s, %(headers)s) RETURNING messages.id] E [parameters: [{'topic': 'org.fedoraproject.test.a.nice.message', 'timestamp': datetime.datetime(2023, 5, 9, 2, 47, 27, tzinfo=datetime.timezone.utc), 'i': 0, 'head ... (164 characters truncated) ... 7-4b42-bb7f-5d80e216f543', 'msg': {'encouragement': "You're doing great!"}, 'username': None, 'crypto': None, 'certificate': None, 'signature': None}]] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: StatementError _______________________________ test_categories ________________________________ self = dialect = constructor = > statement = parameters = [{'category': 'bodhi', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_rpm_abrt-addon-python3': Tru...g_rpm_kernel': True, 'fedora_messaging_schema': 'bodhi.update.comment.v1', 'fedora_messaging_severity': 20, ...}, ...}] execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 8]}}) args = (, [{'category': 'bodhi', 'cer....comment.v1', 'fedora_messaging_severity': 20, ...}, ...}], , []) kw = {'cache_hit': symbol('CACHE_MISS')} branched = yp = None conn = def _execute_context( self, dialect, constructor, statement, parameters, execution_options, *args, **kw ): """Create an :class:`.ExecutionContext` and execute, returning a :class:`_engine.CursorResult`.""" branched = self if self.__branch_from: # if this is a "branched" connection, do everything in terms # of the "root" connection, *except* for .close(), which is # the only feature that branching provides self = self.__branch_from if execution_options: yp = execution_options.get("yield_per", None) if yp: execution_options = execution_options.union( {"stream_results": True, "max_row_buffer": yp} ) try: conn = self._dbapi_connection 
if conn is None: conn = self._revalidate_connection() > context = constructor( dialect, self, conn, execution_options, *args, **kw ) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ cls = dialect = connection = dbapi_connection = execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 8]}}) compiled = parameters = [{'category': 'bodhi', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_rpm_abrt-addon-python3': Tru...g_rpm_kernel': True, 'fedora_messaging_schema': 'bodhi.update.comment.v1', 'fedora_messaging_severity': 20, ...}, ...}] invoked_statement = extracted_parameters = [], cache_hit = symbol('CACHE_MISS') @classmethod def _init_compiled( cls, dialect, connection, dbapi_connection, execution_options, compiled, parameters, invoked_statement, extracted_parameters, cache_hit=CACHING_DISABLED, ): """Initialize execution context for a Compiled construct.""" self = cls.__new__(cls) self.root_connection = connection self._dbapi_connection = dbapi_connection self.dialect = connection.dialect self.extracted_parameters = extracted_parameters self.invoked_statement = invoked_statement self.compiled = compiled self.cache_hit = cache_hit self.execution_options = execution_options self._is_future_result = ( connection._is_future or self.execution_options.get("future_result", False) ) self.result_column_struct = ( compiled._result_columns, compiled._ordered_columns, compiled._textual_ordered_columns, compiled._ad_hoc_textual, compiled._loose_column_name_matching, ) self.isinsert = compiled.isinsert self.isupdate = compiled.isupdate self.isdelete = compiled.isdelete self.is_text = compiled.isplaintext if self.isinsert or self.isupdate or self.isdelete: self.is_crud = True self._is_explicit_returning = bool(compiled.statement._returning) self._is_implicit_returning = bool( compiled.returning and not compiled.statement._returning ) if not 
parameters: self.compiled_parameters = [ compiled.construct_params( extracted_parameters=extracted_parameters, escape_names=False, ) ] else: self.compiled_parameters = [ compiled.construct_params( m, escape_names=False, _group_number=grp, extracted_parameters=extracted_parameters, ) for grp, m in enumerate(parameters) ] self.executemany = len(parameters) > 1 # this must occur before create_cursor() since the statement # has to be regexed in some cases for server side cursor if util.py2k: self.unicode_statement = util.text_type(compiled.string) else: self.unicode_statement = compiled.string self.cursor = self.create_cursor() if self.compiled.insert_prefetch or self.compiled.update_prefetch: if self.executemany: self._process_executemany_defaults() else: > self._process_executesingle_defaults() /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _process_executesingle_defaults(self): key_getter = self.compiled._within_exec_param_key_getter self.current_parameters = ( compiled_parameters ) = self.compiled_parameters[0] for c in self.compiled.insert_prefetch: if c.default and not c.default.is_sequence and c.default.is_scalar: val = c.default.arg else: > val = self.get_insert_default(c) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.primary_key and column is column.table._autoincrement_column: if column.server_default and column.server_default.has_argument: # pre-execute passive defaults on primary key columns return self._execute_scalar( "select %s" % column.server_default.arg, column.type ) elif column.default is None or ( column.default.is_sequence and column.default.optional ): # execute the sequence associated with a SERIAL 
primary # key column. for non-primary-key SERIAL, the ID just # generates server side. try: seq_name = column._postgresql_seq_name except AttributeError: tab = column.table.name col = column.name tab = tab[0 : 29 + max(0, (29 - len(col)))] col = col[0 : 29 + max(0, (29 - len(tab)))] name = "%s_%s_seq" % (tab, col) column._postgresql_seq_name = seq_name = name if column.table is not None: effective_schema = self.connection.schema_for_object( column.table ) else: effective_schema = None if effective_schema is not None: exc = 'select nextval(\'"%s"."%s"\')' % ( effective_schema, seq_name, ) else: exc = "select nextval('\"%s\"')" % (seq_name,) return self._execute_scalar(exc, column.type) > return super(PGExecutionContext, self).get_insert_default(column) /usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.default is None: return None else: > return self._exec_default(column, column.default, column.type) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) default = ColumnDefault() type_ = Unicode() def _exec_default(self, column, default, type_): if default.is_sequence: return self.fire_sequence(default, type_) elif default.is_callable: self.current_column = column > return default.arg(self) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ context = def source_version_default(context): > dist = pkg_resources.get_distribution("datanommer.models") 
../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ dist = Requirement.parse('datanommer.models') def get_distribution(dist): """Return a current distribution object for a Requirement or string""" if isinstance(dist, str): dist = Requirement.parse(dist) if isinstance(dist, Requirement): > dist = get_provider(dist) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ moduleOrReq = Requirement.parse('datanommer.models') def get_provider(moduleOrReq): """Return an IResourceProvider for the named module or requirement""" if isinstance(moduleOrReq, Requirement): > return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = requirements = ('datanommer.models',) def require(self, *requirements): """Ensure that distributions matching `requirements` are activated `requirements` must be a string or a (possibly-nested) sequence thereof, specifying the distributions and versions required. The return value is a sequence of the distributions that needed to be activated to fulfill the requirements; all relevant distributions are included, even if they were already activated in this working set. 
""" > needed = self.resolve(parse_requirements(requirements)) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , requirements = [] env = , installer = None replace_conflicting = False, extras = None def resolve(self, requirements, env=None, installer=None, # noqa: C901 replace_conflicting=False, extras=None): """List all distributions needed to (recursively) meet `requirements` `requirements` must be a sequence of ``Requirement`` objects. `env`, if supplied, should be an ``Environment`` instance. If not supplied, it defaults to all distributions available within any entry or distribution in the working set. `installer`, if supplied, will be invoked with each requirement that cannot be met by an already-installed distribution; it should return a ``Distribution`` or ``None``. Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. 
required_by = collections.defaultdict(set) while requirements: # process dependencies breadth-first req = requirements.pop(0) if req in processed: # Ignore cyclic or redundant dependencies continue if not req_extras.markers_pass(req, extras): continue dist = best.get(req.key) if dist is None: # Find the best distribution and add it to the map dist = self.by_key.get(req.key) if dist is None or (dist not in req and replace_conflicting): ws = self if env is None: if dist is None: env = Environment(self.entries) else: # Use an empty environment and workingset to avoid # any further conflicts with the conflicting # distribution env = Environment([]) ws = WorkingSet([]) dist = best[req.key] = env.best_match( req, ws, installer, replace_conflicting=replace_conflicting ) if dist is None: requirers = required_by.get(req, None) > raise DistributionNotFound(req, requirers) E pkg_resources.DistributionNotFound: The 'datanommer.models' distribution was not found and is required by the application /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: DistributionNotFound The above exception was the direct cause of the following exception: datanommer_models = def test_categories(datanommer_models): > add(generate_bodhi_update_complete_message()) tests/test_model.py:196: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:147: in add Message.create( ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:275: in create session.flush() :2: in flush ??? 
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3444: in flush
    self._flush(objects)
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3583: in _flush
    with util.safe_reraise():
/usr/lib64/python3.11/site-packages/sqlalchemy/util/langhelpers.py:70: in __exit__
    compat.raise_(
/usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_
    raise exception
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3544: in _flush
    flush_context.execute()
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:456: in execute
    rec.execute(self)
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:630: in execute
    util.preloaded.orm_persistence.save_obj(
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:245: in save_obj
    _emit_insert_statements(
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:1238: in _emit_insert_statements
    result = connection._execute_20(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1705: in _execute_20
    return meth(self, args_10style, kwargs_10style, execution_options)
/usr/lib64/python3.11/site-packages/sqlalchemy/sql/elements.py:334: in _execute_on_connection
    return connection._execute_clauseelement(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1572: in _execute_clauseelement
    ret = self._execute_context(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1806: in _execute_context
    self._handle_dbapi_exception(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:2124: in _handle_dbapi_exception
    util.raise_(
/usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_
    raise exception
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: in _execute_context
    context = constructor(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: in _init_compiled
    self._process_executesingle_defaults()
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: in _process_executesingle_defaults
    val = self.get_insert_default(c)
/usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: in get_insert_default
    return super(PGExecutionContext, self).get_insert_default(column)
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: in get_insert_default
    return self._exec_default(column, column.default, column.type)
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: in _exec_default
    return default.arg(self)
../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: in source_version_default
    dist = pkg_resources.get_distribution("datanommer.models")
/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: in get_distribution
    dist = get_provider(dist)
/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: in get_provider
    return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0]
/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: in require
    needed = self.resolve(parse_requirements(requirements))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = , requirements = []
env = , installer = None
replace_conflicting = False, extras = None

    def resolve(self, requirements, env=None, installer=None,  # noqa: C901
                replace_conflicting=False, extras=None):
        """List all distributions needed to (recursively) meet `requirements`

        `requirements` must be a sequence of ``Requirement`` objects.  `env`,
        if supplied, should be an ``Environment`` instance.  If
        not supplied, it defaults to all distributions available within any
        entry or distribution in the working set.  `installer`, if supplied,
        will be invoked with each requirement that cannot be met by an
        already-installed distribution; it should return a ``Distribution`` or
        ``None``.
Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. required_by = collections.defaultdict(set) while requirements: # process dependencies breadth-first req = requirements.pop(0) if req in processed: # Ignore cyclic or redundant dependencies continue if not req_extras.markers_pass(req, extras): continue dist = best.get(req.key) if dist is None: # Find the best distribution and add it to the map dist = self.by_key.get(req.key) if dist is None or (dist not in req and replace_conflicting): ws = self if env is None: if dist is None: env = Environment(self.entries) else: # Use an empty environment and workingset to avoid # any further conflicts with the conflicting # distribution env = Environment([]) ws = WorkingSet([]) dist = best[req.key] = env.best_match( req, ws, installer, replace_conflicting=replace_conflicting ) if dist is None: requirers = required_by.get(req, None) > raise DistributionNotFound(req, requirers) E sqlalchemy.exc.StatementError: (pkg_resources.DistributionNotFound) The 'datanommer.models' distribution was not found and is required by the application E [SQL: INSERT INTO messages (msg_id, i, topic, timestamp, certificate, 
signature, category, username, crypto, source_name, source_version, msg, headers) VALUES (%(msg_id)s, %(i)s, %(topic)s, %(timestamp)s, %(certificate)s, %(signature)s, %(category)s, %(username)s, %(crypto)s, %(source_name)s, %(source_version)s, %(msg)s, %(headers)s) RETURNING messages.id] E [parameters: [{'topic': 'org.fedoraproject.stg.bodhi.update.comment', 'timestamp': datetime.datetime(2023, 5, 9, 2, 47, 40, tzinfo=datetime.timezone.utc), 'i': 0, ... (679 characters truncated) ... esting', 'user': {'name': 'ryanlerch'}}, 'user': {'name': 'dudemcpants'}}}, 'username': None, 'crypto': None, 'certificate': None, 'signature': None}]] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: StatementError ___________________________ test_categories_with_umb ___________________________ self = dialect = constructor = > statement = parameters = [{'category': 'brew', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:47:52+00:00'}, ...}] execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 9]}}) args = (, [{'category': 'brew', 'cert...erity': 20, 'sent-at': '2023-05-09T02:47:52+00:00'}, ...}], , []) kw = {'cache_hit': symbol('CACHE_MISS')} branched = yp = None conn = def _execute_context( self, dialect, constructor, statement, parameters, execution_options, *args, **kw ): """Create an :class:`.ExecutionContext` and execute, returning a :class:`_engine.CursorResult`.""" branched = self if self.__branch_from: # if this is a "branched" connection, do everything in terms # of the "root" connection, *except* for .close(), which is # the only feature that branching provides self = self.__branch_from if execution_options: yp = execution_options.get("yield_per", None) if yp: execution_options = execution_options.union( {"stream_results": True, "max_row_buffer": yp} ) try: conn = self._dbapi_connection if conn is None: conn = 
self._revalidate_connection() > context = constructor( dialect, self, conn, execution_options, *args, **kw ) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ cls = dialect = connection = dbapi_connection = execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 9]}}) compiled = parameters = [{'category': 'brew', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:47:52+00:00'}, ...}] invoked_statement = extracted_parameters = [], cache_hit = symbol('CACHE_MISS') @classmethod def _init_compiled( cls, dialect, connection, dbapi_connection, execution_options, compiled, parameters, invoked_statement, extracted_parameters, cache_hit=CACHING_DISABLED, ): """Initialize execution context for a Compiled construct.""" self = cls.__new__(cls) self.root_connection = connection self._dbapi_connection = dbapi_connection self.dialect = connection.dialect self.extracted_parameters = extracted_parameters self.invoked_statement = invoked_statement self.compiled = compiled self.cache_hit = cache_hit self.execution_options = execution_options self._is_future_result = ( connection._is_future or self.execution_options.get("future_result", False) ) self.result_column_struct = ( compiled._result_columns, compiled._ordered_columns, compiled._textual_ordered_columns, compiled._ad_hoc_textual, compiled._loose_column_name_matching, ) self.isinsert = compiled.isinsert self.isupdate = compiled.isupdate self.isdelete = compiled.isdelete self.is_text = compiled.isplaintext if self.isinsert or self.isupdate or self.isdelete: self.is_crud = True self._is_explicit_returning = bool(compiled.statement._returning) self._is_implicit_returning = bool( compiled.returning and not compiled.statement._returning ) if not parameters: self.compiled_parameters = [ compiled.construct_params( 
extracted_parameters=extracted_parameters, escape_names=False, ) ] else: self.compiled_parameters = [ compiled.construct_params( m, escape_names=False, _group_number=grp, extracted_parameters=extracted_parameters, ) for grp, m in enumerate(parameters) ] self.executemany = len(parameters) > 1 # this must occur before create_cursor() since the statement # has to be regexed in some cases for server side cursor if util.py2k: self.unicode_statement = util.text_type(compiled.string) else: self.unicode_statement = compiled.string self.cursor = self.create_cursor() if self.compiled.insert_prefetch or self.compiled.update_prefetch: if self.executemany: self._process_executemany_defaults() else: > self._process_executesingle_defaults() /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _process_executesingle_defaults(self): key_getter = self.compiled._within_exec_param_key_getter self.current_parameters = ( compiled_parameters ) = self.compiled_parameters[0] for c in self.compiled.insert_prefetch: if c.default and not c.default.is_sequence and c.default.is_scalar: val = c.default.arg else: > val = self.get_insert_default(c) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.primary_key and column is column.table._autoincrement_column: if column.server_default and column.server_default.has_argument: # pre-execute passive defaults on primary key columns return self._execute_scalar( "select %s" % column.server_default.arg, column.type ) elif column.default is None or ( column.default.is_sequence and column.default.optional ): # execute the sequence associated with a SERIAL primary # key column. 
for non-primary-key SERIAL, the ID just # generates server side. try: seq_name = column._postgresql_seq_name except AttributeError: tab = column.table.name col = column.name tab = tab[0 : 29 + max(0, (29 - len(col)))] col = col[0 : 29 + max(0, (29 - len(tab)))] name = "%s_%s_seq" % (tab, col) column._postgresql_seq_name = seq_name = name if column.table is not None: effective_schema = self.connection.schema_for_object( column.table ) else: effective_schema = None if effective_schema is not None: exc = 'select nextval(\'"%s"."%s"\')' % ( effective_schema, seq_name, ) else: exc = "select nextval('\"%s\"')" % (seq_name,) return self._execute_scalar(exc, column.type) > return super(PGExecutionContext, self).get_insert_default(column) /usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.default is None: return None else: > return self._exec_default(column, column.default, column.type) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) default = ColumnDefault() type_ = Unicode() def _exec_default(self, column, default, type_): if default.is_sequence: return self.fire_sequence(default, type_) elif default.is_callable: self.current_column = column > return default.arg(self) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ context = def source_version_default(context): > dist = pkg_resources.get_distribution("datanommer.models") ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ dist = Requirement.parse('datanommer.models') def get_distribution(dist): """Return a current distribution object for a Requirement or string""" if isinstance(dist, str): dist = Requirement.parse(dist) if isinstance(dist, Requirement): > dist = get_provider(dist) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ moduleOrReq = Requirement.parse('datanommer.models') def get_provider(moduleOrReq): """Return an IResourceProvider for the named module or requirement""" if isinstance(moduleOrReq, Requirement): > return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = requirements = ('datanommer.models',) def require(self, *requirements): """Ensure that distributions matching `requirements` are activated `requirements` must be a string or a (possibly-nested) sequence thereof, specifying the distributions and versions required. The return value is a sequence of the distributions that needed to be activated to fulfill the requirements; all relevant distributions are included, even if they were already activated in this working set. """ > needed = self.resolve(parse_requirements(requirements)) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , requirements = [] env = , installer = None replace_conflicting = False, extras = None def resolve(self, requirements, env=None, installer=None, # noqa: C901 replace_conflicting=False, extras=None): """List all distributions needed to (recursively) meet `requirements` `requirements` must be a sequence of ``Requirement`` objects. `env`, if supplied, should be an ``Environment`` instance. 
If not supplied, it defaults to all distributions available within any entry or distribution in the working set. `installer`, if supplied, will be invoked with each requirement that cannot be met by an already-installed distribution; it should return a ``Distribution`` or ``None``. Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. 
required_by = collections.defaultdict(set) while requirements: # process dependencies breadth-first req = requirements.pop(0) if req in processed: # Ignore cyclic or redundant dependencies continue if not req_extras.markers_pass(req, extras): continue dist = best.get(req.key) if dist is None: # Find the best distribution and add it to the map dist = self.by_key.get(req.key) if dist is None or (dist not in req and replace_conflicting): ws = self if env is None: if dist is None: env = Environment(self.entries) else: # Use an empty environment and workingset to avoid # any further conflicts with the conflicting # distribution env = Environment([]) ws = WorkingSet([]) dist = best[req.key] = env.best_match( req, ws, installer, replace_conflicting=replace_conflicting ) if dist is None: requirers = required_by.get(req, None) > raise DistributionNotFound(req, requirers) E pkg_resources.DistributionNotFound: The 'datanommer.models' distribution was not found and is required by the application /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: DistributionNotFound The above exception was the direct cause of the following exception: datanommer_models = def test_categories_with_umb(datanommer_models): > add(generate_message(topic="/topic/VirtualTopic.eng.brew.task.closed")) tests/test_model.py:203: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:147: in add Message.create( ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:275: in create session.flush() :2: in flush ??? 
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3444: in flush self._flush(objects) /usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3583: in _flush with util.safe_reraise(): /usr/lib64/python3.11/site-packages/sqlalchemy/util/langhelpers.py:70: in __exit__ compat.raise_( /usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_ raise exception /usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3544: in _flush flush_context.execute() /usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:456: in execute rec.execute(self) /usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:630: in execute util.preloaded.orm_persistence.save_obj( /usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:245: in save_obj _emit_insert_statements( /usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:1238: in _emit_insert_statements result = connection._execute_20( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1705: in _execute_20 return meth(self, args_10style, kwargs_10style, execution_options) /usr/lib64/python3.11/site-packages/sqlalchemy/sql/elements.py:334: in _execute_on_connection return connection._execute_clauseelement( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1572: in _execute_clauseelement ret = self._execute_context( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1806: in _execute_context self._handle_dbapi_exception( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:2124: in _handle_dbapi_exception util.raise_( /usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_ raise exception /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: in _execute_context context = constructor( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: in _init_compiled self._process_executesingle_defaults() /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: in 
_process_executesingle_defaults val = self.get_insert_default(c) /usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: in get_insert_default return super(PGExecutionContext, self).get_insert_default(column) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: in get_insert_default return self._exec_default(column, column.default, column.type) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: in _exec_default return default.arg(self) ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: in source_version_default dist = pkg_resources.get_distribution("datanommer.models") /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: in get_distribution dist = get_provider(dist) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: in get_provider return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: in require needed = self.resolve(parse_requirements(requirements)) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , requirements = [] env = , installer = None replace_conflicting = False, extras = None def resolve(self, requirements, env=None, installer=None, # noqa: C901 replace_conflicting=False, extras=None): """List all distributions needed to (recursively) meet `requirements` `requirements` must be a sequence of ``Requirement`` objects. `env`, if supplied, should be an ``Environment`` instance. If not supplied, it defaults to all distributions available within any entry or distribution in the working set. `installer`, if supplied, will be invoked with each requirement that cannot be met by an already-installed distribution; it should return a ``Distribution`` or ``None``. 
Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. required_by = collections.defaultdict(set) while requirements: # process dependencies breadth-first req = requirements.pop(0) if req in processed: # Ignore cyclic or redundant dependencies continue if not req_extras.markers_pass(req, extras): continue dist = best.get(req.key) if dist is None: # Find the best distribution and add it to the map dist = self.by_key.get(req.key) if dist is None or (dist not in req and replace_conflicting): ws = self if env is None: if dist is None: env = Environment(self.entries) else: # Use an empty environment and workingset to avoid # any further conflicts with the conflicting # distribution env = Environment([]) ws = WorkingSet([]) dist = best[req.key] = env.best_match( req, ws, installer, replace_conflicting=replace_conflicting ) if dist is None: requirers = required_by.get(req, None) > raise DistributionNotFound(req, requirers) E sqlalchemy.exc.StatementError: (pkg_resources.DistributionNotFound) The 'datanommer.models' distribution was not found and is required by the application E [SQL: INSERT INTO messages (msg_id, i, topic, timestamp, certificate, 
signature, category, username, crypto, source_name, source_version, msg, headers) VALUES (%(msg_id)s, %(i)s, %(topic)s, %(timestamp)s, %(certificate)s, %(signature)s, %(category)s, %(username)s, %(crypto)s, %(source_name)s, %(source_version)s, %(msg)s, %(headers)s) RETURNING messages.id] E [parameters: [{'topic': '/topic/VirtualTopic.eng.brew.task.closed', 'timestamp': datetime.datetime(2023, 5, 9, 2, 47, 52, tzinfo=datetime.timezone.utc), 'i': 0, 'h ... (170 characters truncated) ... c-49f7-bdef-56202366c2cd', 'msg': {'encouragement': "You're doing great!"}, 'username': None, 'crypto': None, 'certificate': None, 'signature': None}]] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: StatementError ________________________________ test_grep_all _________________________________ self = dialect = constructor = > statement = parameters = [{'category': 'a', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:48:05+00:00'}, ...}] execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 10]}}) args = (, [{'category': 'a', 'certifi...erity': 20, 'sent-at': '2023-05-09T02:48:05+00:00'}, ...}], , []) kw = {'cache_hit': symbol('CACHE_MISS')} branched = yp = None conn = def _execute_context( self, dialect, constructor, statement, parameters, execution_options, *args, **kw ): """Create an :class:`.ExecutionContext` and execute, returning a :class:`_engine.CursorResult`.""" branched = self if self.__branch_from: # if this is a "branched" connection, do everything in terms # of the "root" connection, *except* for .close(), which is # the only feature that branching provides self = self.__branch_from if execution_options: yp = execution_options.get("yield_per", None) if yp: execution_options = execution_options.union( {"stream_results": True, "max_row_buffer": yp} ) try: conn = self._dbapi_connection if conn is None: conn = 
self._revalidate_connection() > context = constructor( dialect, self, conn, execution_options, *args, **kw ) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ cls = dialect = connection = dbapi_connection = execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 10]}}) compiled = parameters = [{'category': 'a', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:48:05+00:00'}, ...}] invoked_statement = extracted_parameters = [], cache_hit = symbol('CACHE_MISS') @classmethod def _init_compiled( cls, dialect, connection, dbapi_connection, execution_options, compiled, parameters, invoked_statement, extracted_parameters, cache_hit=CACHING_DISABLED, ): """Initialize execution context for a Compiled construct.""" self = cls.__new__(cls) self.root_connection = connection self._dbapi_connection = dbapi_connection self.dialect = connection.dialect self.extracted_parameters = extracted_parameters self.invoked_statement = invoked_statement self.compiled = compiled self.cache_hit = cache_hit self.execution_options = execution_options self._is_future_result = ( connection._is_future or self.execution_options.get("future_result", False) ) self.result_column_struct = ( compiled._result_columns, compiled._ordered_columns, compiled._textual_ordered_columns, compiled._ad_hoc_textual, compiled._loose_column_name_matching, ) self.isinsert = compiled.isinsert self.isupdate = compiled.isupdate self.isdelete = compiled.isdelete self.is_text = compiled.isplaintext if self.isinsert or self.isupdate or self.isdelete: self.is_crud = True self._is_explicit_returning = bool(compiled.statement._returning) self._is_implicit_returning = bool( compiled.returning and not compiled.statement._returning ) if not parameters: self.compiled_parameters = [ compiled.construct_params( 
extracted_parameters=extracted_parameters, escape_names=False, ) ] else: self.compiled_parameters = [ compiled.construct_params( m, escape_names=False, _group_number=grp, extracted_parameters=extracted_parameters, ) for grp, m in enumerate(parameters) ] self.executemany = len(parameters) > 1 # this must occur before create_cursor() since the statement # has to be regexed in some cases for server side cursor if util.py2k: self.unicode_statement = util.text_type(compiled.string) else: self.unicode_statement = compiled.string self.cursor = self.create_cursor() if self.compiled.insert_prefetch or self.compiled.update_prefetch: if self.executemany: self._process_executemany_defaults() else: > self._process_executesingle_defaults() /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _process_executesingle_defaults(self): key_getter = self.compiled._within_exec_param_key_getter self.current_parameters = ( compiled_parameters ) = self.compiled_parameters[0] for c in self.compiled.insert_prefetch: if c.default and not c.default.is_sequence and c.default.is_scalar: val = c.default.arg else: > val = self.get_insert_default(c) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.primary_key and column is column.table._autoincrement_column: if column.server_default and column.server_default.has_argument: # pre-execute passive defaults on primary key columns return self._execute_scalar( "select %s" % column.server_default.arg, column.type ) elif column.default is None or ( column.default.is_sequence and column.default.optional ): # execute the sequence associated with a SERIAL primary # key column. 
for non-primary-key SERIAL, the ID just # generates server side. try: seq_name = column._postgresql_seq_name except AttributeError: tab = column.table.name col = column.name tab = tab[0 : 29 + max(0, (29 - len(col)))] col = col[0 : 29 + max(0, (29 - len(tab)))] name = "%s_%s_seq" % (tab, col) column._postgresql_seq_name = seq_name = name if column.table is not None: effective_schema = self.connection.schema_for_object( column.table ) else: effective_schema = None if effective_schema is not None: exc = 'select nextval(\'"%s"."%s"\')' % ( effective_schema, seq_name, ) else: exc = "select nextval('\"%s\"')" % (seq_name,) return self._execute_scalar(exc, column.type) > return super(PGExecutionContext, self).get_insert_default(column) /usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.default is None: return None else: > return self._exec_default(column, column.default, column.type) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) default = ColumnDefault() type_ = Unicode() def _exec_default(self, column, default, type_): if default.is_sequence: return self.fire_sequence(default, type_) elif default.is_callable: self.current_column = column > return default.arg(self) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ context = def source_version_default(context): > dist = pkg_resources.get_distribution("datanommer.models") ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ dist = Requirement.parse('datanommer.models') def get_distribution(dist): """Return a current distribution object for a Requirement or string""" if isinstance(dist, str): dist = Requirement.parse(dist) if isinstance(dist, Requirement): > dist = get_provider(dist) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ moduleOrReq = Requirement.parse('datanommer.models') def get_provider(moduleOrReq): """Return an IResourceProvider for the named module or requirement""" if isinstance(moduleOrReq, Requirement): > return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = requirements = ('datanommer.models',) def require(self, *requirements): """Ensure that distributions matching `requirements` are activated `requirements` must be a string or a (possibly-nested) sequence thereof, specifying the distributions and versions required. The return value is a sequence of the distributions that needed to be activated to fulfill the requirements; all relevant distributions are included, even if they were already activated in this working set. """ > needed = self.resolve(parse_requirements(requirements)) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , requirements = [] env = , installer = None replace_conflicting = False, extras = None def resolve(self, requirements, env=None, installer=None, # noqa: C901 replace_conflicting=False, extras=None): """List all distributions needed to (recursively) meet `requirements` `requirements` must be a sequence of ``Requirement`` objects. `env`, if supplied, should be an ``Environment`` instance. 
If not supplied, it defaults to all distributions available within any entry or distribution in the working set. `installer`, if supplied, will be invoked with each requirement that cannot be met by an already-installed distribution; it should return a ``Distribution`` or ``None``. Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. 
required_by = collections.defaultdict(set) while requirements: # process dependencies breadth-first req = requirements.pop(0) if req in processed: # Ignore cyclic or redundant dependencies continue if not req_extras.markers_pass(req, extras): continue dist = best.get(req.key) if dist is None: # Find the best distribution and add it to the map dist = self.by_key.get(req.key) if dist is None or (dist not in req and replace_conflicting): ws = self if env is None: if dist is None: env = Environment(self.entries) else: # Use an empty environment and workingset to avoid # any further conflicts with the conflicting # distribution env = Environment([]) ws = WorkingSet([]) dist = best[req.key] = env.best_match( req, ws, installer, replace_conflicting=replace_conflicting ) if dist is None: requirers = required_by.get(req, None) > raise DistributionNotFound(req, requirers) E pkg_resources.DistributionNotFound: The 'datanommer.models' distribution was not found and is required by the application /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: DistributionNotFound The above exception was the direct cause of the following exception: datanommer_models = def test_grep_all(datanommer_models): example_message = generate_message() > add(example_message) tests/test_model.py:211: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:147: in add Message.create( ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:275: in create session.flush() :2: in flush ??? 
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3444: in flush
    self._flush(objects)
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3583: in _flush
    with util.safe_reraise():
/usr/lib64/python3.11/site-packages/sqlalchemy/util/langhelpers.py:70: in __exit__
    compat.raise_(
/usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_
    raise exception
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3544: in _flush
    flush_context.execute()
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:456: in execute
    rec.execute(self)
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:630: in execute
    util.preloaded.orm_persistence.save_obj(
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:245: in save_obj
    _emit_insert_statements(
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:1238: in _emit_insert_statements
    result = connection._execute_20(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1705: in _execute_20
    return meth(self, args_10style, kwargs_10style, execution_options)
/usr/lib64/python3.11/site-packages/sqlalchemy/sql/elements.py:334: in _execute_on_connection
    return connection._execute_clauseelement(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1572: in _execute_clauseelement
    ret = self._execute_context(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1806: in _execute_context
    self._handle_dbapi_exception(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:2124: in _handle_dbapi_exception
    util.raise_(
/usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_
    raise exception
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: in _execute_context
    context = constructor(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: in _init_compiled
    self._process_executesingle_defaults()
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: in _process_executesingle_defaults
    val = self.get_insert_default(c)
/usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: in get_insert_default
    return super(PGExecutionContext, self).get_insert_default(column)
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: in get_insert_default
    return self._exec_default(column, column.default, column.type)
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: in _exec_default
    return default.arg(self)
../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: in source_version_default
    dist = pkg_resources.get_distribution("datanommer.models")
/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: in get_distribution
    dist = get_provider(dist)
/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: in get_provider
    return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0]
/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: in require
    needed = self.resolve(parse_requirements(requirements))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = , requirements = []
env = , installer = None
replace_conflicting = False, extras = None

    def resolve(self, requirements, env=None, installer=None,  # noqa: C901
                replace_conflicting=False, extras=None):
        """List all distributions needed to (recursively) meet `requirements`

        `requirements` must be a sequence of ``Requirement`` objects.  `env`,
        if supplied, should be an ``Environment`` instance.  If
        not supplied, it defaults to all distributions available within any
        entry or distribution in the working set.  `installer`, if supplied,
        will be invoked with each requirement that cannot be met by an
        already-installed distribution; it should return a ``Distribution`` or
        ``None``.

        Unless `replace_conflicting=True`, raises a VersionConflict exception
        if any requirements are found on the path that have the correct name
        but the wrong version.  Otherwise, if an `installer` is supplied it
        will be invoked to obtain the correct version of the requirement and
        activate it.

        `extras` is a list of the extras to be used with these requirements.
        This is important because extra requirements may look like `my_req;
        extra = "my_extra"`, which would otherwise be interpreted as a purely
        optional requirement.  Instead, we want to be able to assert that these
        requirements are truly required.
        """
        # set up the stack
        requirements = list(requirements)[::-1]
        # set of processed requirements
        processed = {}
        # key -> dist
        best = {}
        to_activate = []

        req_extras = _ReqExtras()

        # Mapping of requirement to set of distributions that required it;
        # useful for reporting info about conflicts.
        required_by = collections.defaultdict(set)

        while requirements:
            # process dependencies breadth-first
            req = requirements.pop(0)
            if req in processed:
                # Ignore cyclic or redundant dependencies
                continue

            if not req_extras.markers_pass(req, extras):
                continue

            dist = best.get(req.key)
            if dist is None:
                # Find the best distribution and add it to the map
                dist = self.by_key.get(req.key)
                if dist is None or (dist not in req and replace_conflicting):
                    ws = self
                    if env is None:
                        if dist is None:
                            env = Environment(self.entries)
                        else:
                            # Use an empty environment and workingset to avoid
                            # any further conflicts with the conflicting
                            # distribution
                            env = Environment([])
                            ws = WorkingSet([])
                    dist = best[req.key] = env.best_match(
                        req, ws, installer,
                        replace_conflicting=replace_conflicting
                    )
                    if dist is None:
                        requirers = required_by.get(req, None)
>                       raise DistributionNotFound(req, requirers)
E                       sqlalchemy.exc.StatementError: (pkg_resources.DistributionNotFound) The 'datanommer.models' distribution was not found and is required by the application
E                       [SQL: INSERT INTO messages (msg_id, i, topic, timestamp, certificate, signature, category, username, crypto, source_name, source_version, msg, headers) VALUES (%(msg_id)s, %(i)s, %(topic)s, %(timestamp)s, %(certificate)s, %(signature)s, %(category)s, %(username)s, %(crypto)s, %(source_name)s, %(source_version)s, %(msg)s, %(headers)s) RETURNING messages.id]
E                       [parameters: [{'topic': 'org.fedoraproject.test.a.nice.message', 'timestamp': datetime.datetime(2023, 5, 9, 2, 48, 5, tzinfo=datetime.timezone.utc), 'i': 0, 'heade ... (163 characters truncated) ... 6-433e-a8bd-c2f9f1910685', 'msg': {'encouragement': "You're doing great!"}, 'username': None, 'crypto': None, 'certificate': None, 'signature': None}]]

/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: StatementError
______________________________ test_grep_category ______________________________

self = 
dialect = 
constructor = > statement =
parameters = [{'category': 'bodhi', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:48:17+00:00'}, ...}]
execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 11]}})
args = (, [{'category': 'bodhi', 'cer...erity': 20, 'sent-at': '2023-05-09T02:48:17+00:00'}, ...}], , [])
kw = {'cache_hit': symbol('CACHE_MISS')}
branched = 
yp = None
conn = 

    def _execute_context(
        self,
        dialect,
        constructor,
        statement,
        parameters,
        execution_options,
        *args,
        **kw
    ):
        """Create an :class:`.ExecutionContext` and execute, returning
        a :class:`_engine.CursorResult`."""

        branched = self
        if self.__branch_from:
            # if this is a "branched" connection, do everything in terms
            # of the "root" connection, *except* for .close(), which is
            # the only feature that branching provides
            self = self.__branch_from

        if execution_options:
            yp = execution_options.get("yield_per", None)
            if yp:
                execution_options = execution_options.union(
                    {"stream_results": True, "max_row_buffer": yp}
                )

        try:
            conn = self._dbapi_connection
            if conn is None:
                conn =
self._revalidate_connection()
>       context = constructor(
            dialect, self, conn, execution_options, *args, **kw
        )

/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
cls = 
dialect = 
connection = 
dbapi_connection = 
execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 11]}})
compiled = 
parameters = [{'category': 'bodhi', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:48:17+00:00'}, ...}]
invoked_statement = 
extracted_parameters = [], cache_hit = symbol('CACHE_MISS')

    @classmethod
    def _init_compiled(
        cls,
        dialect,
        connection,
        dbapi_connection,
        execution_options,
        compiled,
        parameters,
        invoked_statement,
        extracted_parameters,
        cache_hit=CACHING_DISABLED,
    ):
        """Initialize execution context for a Compiled construct."""

        self = cls.__new__(cls)
        self.root_connection = connection
        self._dbapi_connection = dbapi_connection
        self.dialect = connection.dialect
        self.extracted_parameters = extracted_parameters
        self.invoked_statement = invoked_statement
        self.compiled = compiled
        self.cache_hit = cache_hit

        self.execution_options = execution_options

        self._is_future_result = (
            connection._is_future
            or self.execution_options.get("future_result", False)
        )

        self.result_column_struct = (
            compiled._result_columns,
            compiled._ordered_columns,
            compiled._textual_ordered_columns,
            compiled._ad_hoc_textual,
            compiled._loose_column_name_matching,
        )

        self.isinsert = compiled.isinsert
        self.isupdate = compiled.isupdate
        self.isdelete = compiled.isdelete
        self.is_text = compiled.isplaintext

        if self.isinsert or self.isupdate or self.isdelete:
            self.is_crud = True
            self._is_explicit_returning = bool(compiled.statement._returning)
            self._is_implicit_returning = bool(
                compiled.returning and not compiled.statement._returning
            )

        if not parameters:
            self.compiled_parameters = [
                compiled.construct_params(
                    extracted_parameters=extracted_parameters,
                    escape_names=False,
                )
            ]
        else:
            self.compiled_parameters = [
                compiled.construct_params(
                    m,
                    escape_names=False,
                    _group_number=grp,
                    extracted_parameters=extracted_parameters,
                )
                for grp, m in enumerate(parameters)
            ]

            self.executemany = len(parameters) > 1

        # this must occur before create_cursor() since the statement
        # has to be regexed in some cases for server side cursor
        if util.py2k:
            self.unicode_statement = util.text_type(compiled.string)
        else:
            self.unicode_statement = compiled.string

        self.cursor = self.create_cursor()

        if self.compiled.insert_prefetch or self.compiled.update_prefetch:
            if self.executemany:
                self._process_executemany_defaults()
            else:
>               self._process_executesingle_defaults()

/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = 

    def _process_executesingle_defaults(self):
        key_getter = self.compiled._within_exec_param_key_getter
        self.current_parameters = (
            compiled_parameters
        ) = self.compiled_parameters[0]

        for c in self.compiled.insert_prefetch:
            if c.default and not c.default.is_sequence and c.default.is_scalar:
                val = c.default.arg
            else:
>               val = self.get_insert_default(c)

/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = 
column = Column('source_version', Unicode(), table=, default=ColumnDefault())

    def get_insert_default(self, column):
        if column.primary_key and column is column.table._autoincrement_column:
            if column.server_default and column.server_default.has_argument:
                # pre-execute passive defaults on primary key columns
                return self._execute_scalar(
                    "select %s" % column.server_default.arg, column.type
                )
            elif column.default is None or (
                column.default.is_sequence and column.default.optional
            ):
                # execute the sequence associated with a SERIAL primary
                # key column. for non-primary-key SERIAL, the ID just
                # generates server side.

                try:
                    seq_name = column._postgresql_seq_name
                except AttributeError:
                    tab = column.table.name
                    col = column.name
                    tab = tab[0 : 29 + max(0, (29 - len(col)))]
                    col = col[0 : 29 + max(0, (29 - len(tab)))]
                    name = "%s_%s_seq" % (tab, col)
                    column._postgresql_seq_name = seq_name = name

                if column.table is not None:
                    effective_schema = self.connection.schema_for_object(
                        column.table
                    )
                else:
                    effective_schema = None

                if effective_schema is not None:
                    exc = 'select nextval(\'"%s"."%s"\')' % (
                        effective_schema,
                        seq_name,
                    )
                else:
                    exc = "select nextval('\"%s\"')" % (seq_name,)

                return self._execute_scalar(exc, column.type)

>       return super(PGExecutionContext, self).get_insert_default(column)

/usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = 
column = Column('source_version', Unicode(), table=, default=ColumnDefault())

    def get_insert_default(self, column):
        if column.default is None:
            return None
        else:
>           return self._exec_default(column, column.default, column.type)

/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = 
column = Column('source_version', Unicode(), table=, default=ColumnDefault())
default = ColumnDefault()
type_ = Unicode()

    def _exec_default(self, column, default, type_):
        if default.is_sequence:
            return self.fire_sequence(default, type_)
        elif default.is_callable:
            self.current_column = column
>           return default.arg(self)

/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
context = 

    def source_version_default(context):
>       dist = pkg_resources.get_distribution("datanommer.models")

../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162:
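The frames above bottom out in the column default itself: `source_version_default` calls `pkg_resources.get_distribution("datanommer.models")` at flush time, so when the distribution metadata is not importable (as in this buildroot), every INSERT aborts. A minimal sketch of a more tolerant default callable — hypothetical, not the packaged code — using the stdlib `importlib.metadata` with a placeholder fallback:

```python
from importlib import metadata


def source_version_default(context, dist_name="datanommer.models"):
    """Hypothetical tolerant variant of the failing column default.

    Looks up the installed version via the stdlib importlib.metadata;
    returns a placeholder instead of raising, so missing dist-info or
    egg-info metadata does not abort the INSERT during session.flush().
    """
    try:
        return metadata.version(dist_name)
    except metadata.PackageNotFoundError:
        return "unknown"
```

The `context` argument mirrors the signature SQLAlchemy passes to callable column defaults; it is unused here, as in the traceback's original.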
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
dist = Requirement.parse('datanommer.models')

    def get_distribution(dist):
        """Return a current distribution object for a Requirement or string"""
        if isinstance(dist, str):
            dist = Requirement.parse(dist)
        if isinstance(dist, Requirement):
>           dist = get_provider(dist)

/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
moduleOrReq = Requirement.parse('datanommer.models')

    def get_provider(moduleOrReq):
        """Return an IResourceProvider for the named module or requirement"""
        if isinstance(moduleOrReq, Requirement):
>           return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0]

/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = 
requirements = ('datanommer.models',)

    def require(self, *requirements):
        """Ensure that distributions matching `requirements` are activated

        `requirements` must be a string or a (possibly-nested) sequence
        thereof, specifying the distributions and versions required.  The
        return value is a sequence of the distributions that needed to be
        activated to fulfill the requirements; all relevant distributions are
        included, even if they were already activated in this working set.
        """
>       needed = self.resolve(parse_requirements(requirements))

/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = , requirements = []
env = , installer = None
replace_conflicting = False, extras = None

    def resolve(self, requirements, env=None, installer=None,  # noqa: C901
                replace_conflicting=False, extras=None):
        """List all distributions needed to (recursively) meet `requirements`

        `requirements` must be a sequence of ``Requirement`` objects.  `env`,
        if supplied, should be an ``Environment`` instance.  If
        not supplied, it defaults to all distributions available within any
        entry or distribution in the working set.  `installer`, if supplied,
        will be invoked with each requirement that cannot be met by an
        already-installed distribution; it should return a ``Distribution`` or
        ``None``.

        Unless `replace_conflicting=True`, raises a VersionConflict exception
        if any requirements are found on the path that have the correct name
        but the wrong version.  Otherwise, if an `installer` is supplied it
        will be invoked to obtain the correct version of the requirement and
        activate it.

        `extras` is a list of the extras to be used with these requirements.
        This is important because extra requirements may look like `my_req;
        extra = "my_extra"`, which would otherwise be interpreted as a purely
        optional requirement.  Instead, we want to be able to assert that these
        requirements are truly required.
        """
        # set up the stack
        requirements = list(requirements)[::-1]
        # set of processed requirements
        processed = {}
        # key -> dist
        best = {}
        to_activate = []

        req_extras = _ReqExtras()

        # Mapping of requirement to set of distributions that required it;
        # useful for reporting info about conflicts.
        required_by = collections.defaultdict(set)

        while requirements:
            # process dependencies breadth-first
            req = requirements.pop(0)
            if req in processed:
                # Ignore cyclic or redundant dependencies
                continue

            if not req_extras.markers_pass(req, extras):
                continue

            dist = best.get(req.key)
            if dist is None:
                # Find the best distribution and add it to the map
                dist = self.by_key.get(req.key)
                if dist is None or (dist not in req and replace_conflicting):
                    ws = self
                    if env is None:
                        if dist is None:
                            env = Environment(self.entries)
                        else:
                            # Use an empty environment and workingset to avoid
                            # any further conflicts with the conflicting
                            # distribution
                            env = Environment([])
                            ws = WorkingSet([])
                    dist = best[req.key] = env.best_match(
                        req, ws, installer,
                        replace_conflicting=replace_conflicting
                    )
                    if dist is None:
                        requirers = required_by.get(req, None)
>                       raise DistributionNotFound(req, requirers)
E                       pkg_resources.DistributionNotFound: The 'datanommer.models' distribution was not found and is required by the application

/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: DistributionNotFound

The above exception was the direct cause of the following exception:

datanommer_models = 

    def test_grep_category(datanommer_models):
        example_message = generate_message(topic="org.fedoraproject.prod.bodhi.newupdate")
>       add(example_message)

tests/test_model.py:222:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:147: in add
    Message.create(
../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:275: in create
    session.flush()
:2: in flush
    ???
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3444: in flush
    self._flush(objects)
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3583: in _flush
    with util.safe_reraise():
/usr/lib64/python3.11/site-packages/sqlalchemy/util/langhelpers.py:70: in __exit__
    compat.raise_(
/usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_
    raise exception
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3544: in _flush
    flush_context.execute()
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:456: in execute
    rec.execute(self)
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:630: in execute
    util.preloaded.orm_persistence.save_obj(
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:245: in save_obj
    _emit_insert_statements(
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:1238: in _emit_insert_statements
    result = connection._execute_20(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1705: in _execute_20
    return meth(self, args_10style, kwargs_10style, execution_options)
/usr/lib64/python3.11/site-packages/sqlalchemy/sql/elements.py:334: in _execute_on_connection
    return connection._execute_clauseelement(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1572: in _execute_clauseelement
    ret = self._execute_context(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1806: in _execute_context
    self._handle_dbapi_exception(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:2124: in _handle_dbapi_exception
    util.raise_(
/usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_
    raise exception
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: in _execute_context
    context = constructor(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: in _init_compiled
    self._process_executesingle_defaults()
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: in _process_executesingle_defaults
    val = self.get_insert_default(c)
/usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: in get_insert_default
    return super(PGExecutionContext, self).get_insert_default(column)
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: in get_insert_default
    return self._exec_default(column, column.default, column.type)
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: in _exec_default
    return default.arg(self)
../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: in source_version_default
    dist = pkg_resources.get_distribution("datanommer.models")
/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: in get_distribution
    dist = get_provider(dist)
/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: in get_provider
    return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0]
/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: in require
    needed = self.resolve(parse_requirements(requirements))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = , requirements = []
env = , installer = None
replace_conflicting = False, extras = None

    def resolve(self, requirements, env=None, installer=None,  # noqa: C901
                replace_conflicting=False, extras=None):
        """List all distributions needed to (recursively) meet `requirements`

        `requirements` must be a sequence of ``Requirement`` objects.  `env`,
        if supplied, should be an ``Environment`` instance.  If
        not supplied, it defaults to all distributions available within any
        entry or distribution in the working set.  `installer`, if supplied,
        will be invoked with each requirement that cannot be met by an
        already-installed distribution; it should return a ``Distribution`` or
        ``None``.

        Unless `replace_conflicting=True`, raises a VersionConflict exception
        if any requirements are found on the path that have the correct name
        but the wrong version.  Otherwise, if an `installer` is supplied it
        will be invoked to obtain the correct version of the requirement and
        activate it.

        `extras` is a list of the extras to be used with these requirements.
        This is important because extra requirements may look like `my_req;
        extra = "my_extra"`, which would otherwise be interpreted as a purely
        optional requirement.  Instead, we want to be able to assert that these
        requirements are truly required.
        """
        # set up the stack
        requirements = list(requirements)[::-1]
        # set of processed requirements
        processed = {}
        # key -> dist
        best = {}
        to_activate = []

        req_extras = _ReqExtras()

        # Mapping of requirement to set of distributions that required it;
        # useful for reporting info about conflicts.
        required_by = collections.defaultdict(set)

        while requirements:
            # process dependencies breadth-first
            req = requirements.pop(0)
            if req in processed:
                # Ignore cyclic or redundant dependencies
                continue

            if not req_extras.markers_pass(req, extras):
                continue

            dist = best.get(req.key)
            if dist is None:
                # Find the best distribution and add it to the map
                dist = self.by_key.get(req.key)
                if dist is None or (dist not in req and replace_conflicting):
                    ws = self
                    if env is None:
                        if dist is None:
                            env = Environment(self.entries)
                        else:
                            # Use an empty environment and workingset to avoid
                            # any further conflicts with the conflicting
                            # distribution
                            env = Environment([])
                            ws = WorkingSet([])
                    dist = best[req.key] = env.best_match(
                        req, ws, installer,
                        replace_conflicting=replace_conflicting
                    )
                    if dist is None:
                        requirers = required_by.get(req, None)
>                       raise DistributionNotFound(req, requirers)
E                       sqlalchemy.exc.StatementError: (pkg_resources.DistributionNotFound) The 'datanommer.models' distribution was not found and is required by the application
E                       [SQL: INSERT INTO messages (msg_id, i, topic, timestamp, certificate, signature, category, username, crypto, source_name, source_version, msg, headers) VALUES (%(msg_id)s, %(i)s, %(topic)s, %(timestamp)s, %(certificate)s, %(signature)s, %(category)s, %(username)s, %(crypto)s, %(source_name)s, %(source_version)s, %(msg)s, %(headers)s) RETURNING messages.id]
E                       [parameters: [{'topic': 'org.fedoraproject.prod.bodhi.newupdate', 'timestamp': datetime.datetime(2023, 5, 9, 2, 48, 17, tzinfo=datetime.timezone.utc), 'i': 0, 'hea ... (169 characters truncated) ... 2-45e9-8807-dd4073e18ee3', 'msg': {'encouragement': "You're doing great!"}, 'username': None, 'crypto': None, 'certificate': None, 'signature': None}]]

/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: StatementError
____________________________ test_grep_not_category ____________________________

self = 
dialect = 
constructor = > statement =
parameters = [{'category': 'bodhi', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:48:30+00:00'}, ...}]
execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 12]}})
args = (, [{'category': 'bodhi', 'cer...erity': 20, 'sent-at': '2023-05-09T02:48:30+00:00'}, ...}], , [])
kw = {'cache_hit': symbol('CACHE_MISS')}
branched = 
yp = None
conn = 

    def _execute_context(
        self,
        dialect,
        constructor,
        statement,
        parameters,
        execution_options,
        *args,
        **kw
    ):
        """Create an :class:`.ExecutionContext` and execute, returning
        a :class:`_engine.CursorResult`."""

        branched = self
        if self.__branch_from:
            # if this is a "branched" connection, do everything in terms
            # of the "root" connection, *except* for .close(), which is
            # the only feature that branching provides
            self = self.__branch_from

        if execution_options:
            yp = execution_options.get("yield_per", None)
            if yp:
                execution_options = execution_options.union(
                    {"stream_results": True, "max_row_buffer": yp}
                )

        try:
            conn = self._dbapi_connection
            if conn is None:
                conn =
self._revalidate_connection()
>       context = constructor(
            dialect, self, conn, execution_options, *args, **kw
        )

/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
cls = 
dialect = 
connection = 
dbapi_connection = 
execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 12]}})
compiled = 
parameters = [{'category': 'bodhi', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:48:30+00:00'}, ...}]
invoked_statement = 
extracted_parameters = [], cache_hit = symbol('CACHE_MISS')

    @classmethod
    def _init_compiled(
        cls,
        dialect,
        connection,
        dbapi_connection,
        execution_options,
        compiled,
        parameters,
        invoked_statement,
        extracted_parameters,
        cache_hit=CACHING_DISABLED,
    ):
        """Initialize execution context for a Compiled construct."""

        self = cls.__new__(cls)
        self.root_connection = connection
        self._dbapi_connection = dbapi_connection
        self.dialect = connection.dialect
        self.extracted_parameters = extracted_parameters
        self.invoked_statement = invoked_statement
        self.compiled = compiled
        self.cache_hit = cache_hit

        self.execution_options = execution_options

        self._is_future_result = (
            connection._is_future
            or self.execution_options.get("future_result", False)
        )

        self.result_column_struct = (
            compiled._result_columns,
            compiled._ordered_columns,
            compiled._textual_ordered_columns,
            compiled._ad_hoc_textual,
            compiled._loose_column_name_matching,
        )

        self.isinsert = compiled.isinsert
        self.isupdate = compiled.isupdate
        self.isdelete = compiled.isdelete
        self.is_text = compiled.isplaintext

        if self.isinsert or self.isupdate or self.isdelete:
            self.is_crud = True
            self._is_explicit_returning = bool(compiled.statement._returning)
            self._is_implicit_returning = bool(
                compiled.returning and not compiled.statement._returning
            )

        if not parameters:
            self.compiled_parameters = [
                compiled.construct_params(
                    extracted_parameters=extracted_parameters,
                    escape_names=False,
                )
            ]
        else:
            self.compiled_parameters = [
                compiled.construct_params(
                    m,
                    escape_names=False,
                    _group_number=grp,
                    extracted_parameters=extracted_parameters,
                )
                for grp, m in enumerate(parameters)
            ]

            self.executemany = len(parameters) > 1

        # this must occur before create_cursor() since the statement
        # has to be regexed in some cases for server side cursor
        if util.py2k:
            self.unicode_statement = util.text_type(compiled.string)
        else:
            self.unicode_statement = compiled.string

        self.cursor = self.create_cursor()

        if self.compiled.insert_prefetch or self.compiled.update_prefetch:
            if self.executemany:
                self._process_executemany_defaults()
            else:
>               self._process_executesingle_defaults()

/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = 

    def _process_executesingle_defaults(self):
        key_getter = self.compiled._within_exec_param_key_getter
        self.current_parameters = (
            compiled_parameters
        ) = self.compiled_parameters[0]

        for c in self.compiled.insert_prefetch:
            if c.default and not c.default.is_sequence and c.default.is_scalar:
                val = c.default.arg
            else:
>               val = self.get_insert_default(c)

/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = 
column = Column('source_version', Unicode(), table=, default=ColumnDefault())

    def get_insert_default(self, column):
        if column.primary_key and column is column.table._autoincrement_column:
            if column.server_default and column.server_default.has_argument:
                # pre-execute passive defaults on primary key columns
                return self._execute_scalar(
                    "select %s" % column.server_default.arg, column.type
                )
            elif column.default is None or (
                column.default.is_sequence and column.default.optional
            ):
                # execute the sequence associated with a SERIAL primary
                # key column. for non-primary-key SERIAL, the ID just
                # generates server side.

                try:
                    seq_name = column._postgresql_seq_name
                except AttributeError:
                    tab = column.table.name
                    col = column.name
                    tab = tab[0 : 29 + max(0, (29 - len(col)))]
                    col = col[0 : 29 + max(0, (29 - len(tab)))]
                    name = "%s_%s_seq" % (tab, col)
                    column._postgresql_seq_name = seq_name = name

                if column.table is not None:
                    effective_schema = self.connection.schema_for_object(
                        column.table
                    )
                else:
                    effective_schema = None

                if effective_schema is not None:
                    exc = 'select nextval(\'"%s"."%s"\')' % (
                        effective_schema,
                        seq_name,
                    )
                else:
                    exc = "select nextval('\"%s\"')" % (seq_name,)

                return self._execute_scalar(exc, column.type)

>       return super(PGExecutionContext, self).get_insert_default(column)

/usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = 
column = Column('source_version', Unicode(), table=, default=ColumnDefault())

    def get_insert_default(self, column):
        if column.default is None:
            return None
        else:
>           return self._exec_default(column, column.default, column.type)

/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = 
column = Column('source_version', Unicode(), table=, default=ColumnDefault())
default = ColumnDefault()
type_ = Unicode()

    def _exec_default(self, column, default, type_):
        if default.is_sequence:
            return self.fire_sequence(default, type_)
        elif default.is_callable:
            self.current_column = column
>           return default.arg(self)

/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
context = 

    def source_version_default(context):
>       dist = pkg_resources.get_distribution("datanommer.models")

../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
dist = Requirement.parse('datanommer.models')

    def get_distribution(dist):
        """Return a current distribution object for a Requirement or string"""
        if isinstance(dist, str):
            dist = Requirement.parse(dist)
        if isinstance(dist, Requirement):
>           dist = get_provider(dist)

/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
moduleOrReq = Requirement.parse('datanommer.models')

    def get_provider(moduleOrReq):
        """Return an IResourceProvider for the named module or requirement"""
        if isinstance(moduleOrReq, Requirement):
>           return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0]

/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = 
requirements = ('datanommer.models',)

    def require(self, *requirements):
        """Ensure that distributions matching `requirements` are activated

        `requirements` must be a string or a (possibly-nested) sequence
        thereof, specifying the distributions and versions required.  The
        return value is a sequence of the distributions that needed to be
        activated to fulfill the requirements; all relevant distributions are
        included, even if they were already activated in this working set.
        """
>       needed = self.resolve(parse_requirements(requirements))

/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = , requirements = []
env = , installer = None
replace_conflicting = False, extras = None

    def resolve(self, requirements, env=None, installer=None,  # noqa: C901
                replace_conflicting=False, extras=None):
        """List all distributions needed to (recursively) meet `requirements`

        `requirements` must be a sequence of ``Requirement`` objects.  `env`,
        if supplied, should be an ``Environment`` instance.  If
        not supplied, it defaults to all distributions available within any
        entry or distribution in the working set.  `installer`, if supplied,
        will be invoked with each requirement that cannot be met by an
        already-installed distribution; it should return a ``Distribution`` or
        ``None``.

        Unless `replace_conflicting=True`, raises a VersionConflict exception
        if any requirements are found on the path that have the correct name
        but the wrong version.  Otherwise, if an `installer` is supplied it
        will be invoked to obtain the correct version of the requirement and
        activate it.

        `extras` is a list of the extras to be used with these requirements.
        This is important because extra requirements may look like `my_req;
        extra = "my_extra"`, which would otherwise be interpreted as a purely
        optional requirement.  Instead, we want to be able to assert that these
        requirements are truly required.
        """
        # set up the stack
        requirements = list(requirements)[::-1]
        # set of processed requirements
        processed = {}
        # key -> dist
        best = {}
        to_activate = []

        req_extras = _ReqExtras()

        # Mapping of requirement to set of distributions that required it;
        # useful for reporting info about conflicts.
        required_by = collections.defaultdict(set)

        while requirements:
            # process dependencies breadth-first
            req = requirements.pop(0)
            if req in processed:
                # Ignore cyclic or redundant dependencies
                continue

            if not req_extras.markers_pass(req, extras):
                continue

            dist = best.get(req.key)
            if dist is None:
                # Find the best distribution and add it to the map
                dist = self.by_key.get(req.key)
                if dist is None or (dist not in req and replace_conflicting):
                    ws = self
                    if env is None:
                        if dist is None:
                            env = Environment(self.entries)
                        else:
                            # Use an empty environment and workingset to avoid
                            # any further conflicts with the conflicting
                            # distribution
                            env = Environment([])
                            ws = WorkingSet([])
                    dist = best[req.key] = env.best_match(
                        req, ws, installer,
                        replace_conflicting=replace_conflicting
                    )
                    if dist is None:
                        requirers = required_by.get(req, None)
>                       raise DistributionNotFound(req, requirers)
E                       pkg_resources.DistributionNotFound: The 'datanommer.models' distribution was not found and is required by the application

/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: DistributionNotFound

The above exception was the direct cause of the following exception:

datanommer_models = 

    def test_grep_not_category(datanommer_models):
        example_message = generate_message(topic="org.fedoraproject.prod.bodhi.newupdate")
>       add(example_message)

tests/test_model.py:233:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:147: in add
    Message.create(
../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:275: in create
    session.flush()
:2: in flush
    ???
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3444: in flush self._flush(objects) /usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3583: in _flush with util.safe_reraise(): /usr/lib64/python3.11/site-packages/sqlalchemy/util/langhelpers.py:70: in __exit__ compat.raise_( /usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_ raise exception /usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3544: in _flush flush_context.execute() /usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:456: in execute rec.execute(self) /usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:630: in execute util.preloaded.orm_persistence.save_obj( /usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:245: in save_obj _emit_insert_statements( /usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:1238: in _emit_insert_statements result = connection._execute_20( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1705: in _execute_20 return meth(self, args_10style, kwargs_10style, execution_options) /usr/lib64/python3.11/site-packages/sqlalchemy/sql/elements.py:334: in _execute_on_connection return connection._execute_clauseelement( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1572: in _execute_clauseelement ret = self._execute_context( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1806: in _execute_context self._handle_dbapi_exception( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:2124: in _handle_dbapi_exception util.raise_( /usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_ raise exception /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: in _execute_context context = constructor( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: in _init_compiled self._process_executesingle_defaults() /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: in 
_process_executesingle_defaults val = self.get_insert_default(c) /usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: in get_insert_default return super(PGExecutionContext, self).get_insert_default(column) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: in get_insert_default return self._exec_default(column, column.default, column.type) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: in _exec_default return default.arg(self) ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: in source_version_default dist = pkg_resources.get_distribution("datanommer.models") /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: in get_distribution dist = get_provider(dist) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: in get_provider return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: in require needed = self.resolve(parse_requirements(requirements)) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , requirements = [] env = , installer = None replace_conflicting = False, extras = None def resolve(self, requirements, env=None, installer=None, # noqa: C901 replace_conflicting=False, extras=None): """List all distributions needed to (recursively) meet `requirements` `requirements` must be a sequence of ``Requirement`` objects. `env`, if supplied, should be an ``Environment`` instance. If not supplied, it defaults to all distributions available within any entry or distribution in the working set. `installer`, if supplied, will be invoked with each requirement that cannot be met by an already-installed distribution; it should return a ``Distribution`` or ``None``. 
Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. required_by = collections.defaultdict(set) while requirements: # process dependencies breadth-first req = requirements.pop(0) if req in processed: # Ignore cyclic or redundant dependencies continue if not req_extras.markers_pass(req, extras): continue dist = best.get(req.key) if dist is None: # Find the best distribution and add it to the map dist = self.by_key.get(req.key) if dist is None or (dist not in req and replace_conflicting): ws = self if env is None: if dist is None: env = Environment(self.entries) else: # Use an empty environment and workingset to avoid # any further conflicts with the conflicting # distribution env = Environment([]) ws = WorkingSet([]) dist = best[req.key] = env.best_match( req, ws, installer, replace_conflicting=replace_conflicting ) if dist is None: requirers = required_by.get(req, None) > raise DistributionNotFound(req, requirers) E sqlalchemy.exc.StatementError: (pkg_resources.DistributionNotFound) The 'datanommer.models' distribution was not found and is required by the application E [SQL: INSERT INTO messages (msg_id, i, topic, timestamp, certificate, 
signature, category, username, crypto, source_name, source_version, msg, headers) VALUES (%(msg_id)s, %(i)s, %(topic)s, %(timestamp)s, %(certificate)s, %(signature)s, %(category)s, %(username)s, %(crypto)s, %(source_name)s, %(source_version)s, %(msg)s, %(headers)s) RETURNING messages.id] E [parameters: [{'topic': 'org.fedoraproject.prod.bodhi.newupdate', 'timestamp': datetime.datetime(2023, 5, 9, 2, 48, 30, tzinfo=datetime.timezone.utc), 'i': 0, 'hea ... (169 characters truncated) ... 4-4997-af24-9a38d3d23109', 'msg': {'encouragement': "You're doing great!"}, 'username': None, 'crypto': None, 'certificate': None, 'signature': None}]] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: StatementError _______________________________ test_add_headers _______________________________ self = dialect = constructor = > statement = parameters = [{'category': 'bodhi', 'certificate': None, 'crypto': None, 'headers': {'baz': 1, 'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'foo': 'bar', ...}, ...}] execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 13]}}) args = (, [{'category': 'bodhi', 'cer...'fedora_messaging_severity': 20, 'foo': 'bar', ...}, ...}], , []) kw = {'cache_hit': symbol('CACHE_MISS')} branched = yp = None conn = def _execute_context( self, dialect, constructor, statement, parameters, execution_options, *args, **kw ): """Create an :class:`.ExecutionContext` and execute, returning a :class:`_engine.CursorResult`.""" branched = self if self.__branch_from: # if this is a "branched" connection, do everything in terms # of the "root" connection, *except* for .close(), which is # the only feature that branching provides self = self.__branch_from if execution_options: yp = execution_options.get("yield_per", None) if yp: execution_options = execution_options.union( {"stream_results": True, "max_row_buffer": yp} ) try: conn = self._dbapi_connection if conn is None: conn = self._revalidate_connection() > 
context = constructor( dialect, self, conn, execution_options, *args, **kw ) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ cls = dialect = connection = dbapi_connection = execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 13]}}) compiled = parameters = [{'category': 'bodhi', 'certificate': None, 'crypto': None, 'headers': {'baz': 1, 'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'foo': 'bar', ...}, ...}] invoked_statement = extracted_parameters = [], cache_hit = symbol('CACHE_MISS') @classmethod def _init_compiled( cls, dialect, connection, dbapi_connection, execution_options, compiled, parameters, invoked_statement, extracted_parameters, cache_hit=CACHING_DISABLED, ): """Initialize execution context for a Compiled construct.""" self = cls.__new__(cls) self.root_connection = connection self._dbapi_connection = dbapi_connection self.dialect = connection.dialect self.extracted_parameters = extracted_parameters self.invoked_statement = invoked_statement self.compiled = compiled self.cache_hit = cache_hit self.execution_options = execution_options self._is_future_result = ( connection._is_future or self.execution_options.get("future_result", False) ) self.result_column_struct = ( compiled._result_columns, compiled._ordered_columns, compiled._textual_ordered_columns, compiled._ad_hoc_textual, compiled._loose_column_name_matching, ) self.isinsert = compiled.isinsert self.isupdate = compiled.isupdate self.isdelete = compiled.isdelete self.is_text = compiled.isplaintext if self.isinsert or self.isupdate or self.isdelete: self.is_crud = True self._is_explicit_returning = bool(compiled.statement._returning) self._is_implicit_returning = bool( compiled.returning and not compiled.statement._returning ) if not parameters: self.compiled_parameters = [ compiled.construct_params( extracted_parameters=extracted_parameters, 
escape_names=False, ) ] else: self.compiled_parameters = [ compiled.construct_params( m, escape_names=False, _group_number=grp, extracted_parameters=extracted_parameters, ) for grp, m in enumerate(parameters) ] self.executemany = len(parameters) > 1 # this must occur before create_cursor() since the statement # has to be regexed in some cases for server side cursor if util.py2k: self.unicode_statement = util.text_type(compiled.string) else: self.unicode_statement = compiled.string self.cursor = self.create_cursor() if self.compiled.insert_prefetch or self.compiled.update_prefetch: if self.executemany: self._process_executemany_defaults() else: > self._process_executesingle_defaults() /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _process_executesingle_defaults(self): key_getter = self.compiled._within_exec_param_key_getter self.current_parameters = ( compiled_parameters ) = self.compiled_parameters[0] for c in self.compiled.insert_prefetch: if c.default and not c.default.is_sequence and c.default.is_scalar: val = c.default.arg else: > val = self.get_insert_default(c) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.primary_key and column is column.table._autoincrement_column: if column.server_default and column.server_default.has_argument: # pre-execute passive defaults on primary key columns return self._execute_scalar( "select %s" % column.server_default.arg, column.type ) elif column.default is None or ( column.default.is_sequence and column.default.optional ): # execute the sequence associated with a SERIAL primary # key column. for non-primary-key SERIAL, the ID just # generates server side. 
try: seq_name = column._postgresql_seq_name except AttributeError: tab = column.table.name col = column.name tab = tab[0 : 29 + max(0, (29 - len(col)))] col = col[0 : 29 + max(0, (29 - len(tab)))] name = "%s_%s_seq" % (tab, col) column._postgresql_seq_name = seq_name = name if column.table is not None: effective_schema = self.connection.schema_for_object( column.table ) else: effective_schema = None if effective_schema is not None: exc = 'select nextval(\'"%s"."%s"\')' % ( effective_schema, seq_name, ) else: exc = "select nextval('\"%s\"')" % (seq_name,) return self._execute_scalar(exc, column.type) > return super(PGExecutionContext, self).get_insert_default(column) /usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.default is None: return None else: > return self._exec_default(column, column.default, column.type) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) default = ColumnDefault() type_ = Unicode() def _exec_default(self, column, default, type_): if default.is_sequence: return self.fire_sequence(default, type_) elif default.is_callable: self.current_column = column > return default.arg(self) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ context = def source_version_default(context): > dist = pkg_resources.get_distribution("datanommer.models") ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ dist = Requirement.parse('datanommer.models') def get_distribution(dist): """Return a current distribution object for a Requirement or string""" if isinstance(dist, str): dist = Requirement.parse(dist) if isinstance(dist, Requirement): > dist = get_provider(dist) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ moduleOrReq = Requirement.parse('datanommer.models') def get_provider(moduleOrReq): """Return an IResourceProvider for the named module or requirement""" if isinstance(moduleOrReq, Requirement): > return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = requirements = ('datanommer.models',) def require(self, *requirements): """Ensure that distributions matching `requirements` are activated `requirements` must be a string or a (possibly-nested) sequence thereof, specifying the distributions and versions required. The return value is a sequence of the distributions that needed to be activated to fulfill the requirements; all relevant distributions are included, even if they were already activated in this working set. """ > needed = self.resolve(parse_requirements(requirements)) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , requirements = [] env = , installer = None replace_conflicting = False, extras = None def resolve(self, requirements, env=None, installer=None, # noqa: C901 replace_conflicting=False, extras=None): """List all distributions needed to (recursively) meet `requirements` `requirements` must be a sequence of ``Requirement`` objects. `env`, if supplied, should be an ``Environment`` instance. 
If not supplied, it defaults to all distributions available within any entry or distribution in the working set. `installer`, if supplied, will be invoked with each requirement that cannot be met by an already-installed distribution; it should return a ``Distribution`` or ``None``. Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. 
required_by = collections.defaultdict(set) while requirements: # process dependencies breadth-first req = requirements.pop(0) if req in processed: # Ignore cyclic or redundant dependencies continue if not req_extras.markers_pass(req, extras): continue dist = best.get(req.key) if dist is None: # Find the best distribution and add it to the map dist = self.by_key.get(req.key) if dist is None or (dist not in req and replace_conflicting): ws = self if env is None: if dist is None: env = Environment(self.entries) else: # Use an empty environment and workingset to avoid # any further conflicts with the conflicting # distribution env = Environment([]) ws = WorkingSet([]) dist = best[req.key] = env.best_match( req, ws, installer, replace_conflicting=replace_conflicting ) if dist is None: requirers = required_by.get(req, None) > raise DistributionNotFound(req, requirers) E pkg_resources.DistributionNotFound: The 'datanommer.models' distribution was not found and is required by the application /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: DistributionNotFound The above exception was the direct cause of the following exception: datanommer_models = def test_add_headers(datanommer_models): example_headers = {"foo": "bar", "baz": 1, "wibble": ["zork", "zap"]} example_message = generate_message( topic="org.fedoraproject.prod.bodhi.newupdate", headers=example_headers ) > add(example_message) tests/test_model.py:246: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:147: in add Message.create( ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:275: in create session.flush() :2: in flush ??? 
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3444: in flush self._flush(objects) /usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3583: in _flush with util.safe_reraise(): /usr/lib64/python3.11/site-packages/sqlalchemy/util/langhelpers.py:70: in __exit__ compat.raise_( /usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_ raise exception /usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3544: in _flush flush_context.execute() /usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:456: in execute rec.execute(self) /usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:630: in execute util.preloaded.orm_persistence.save_obj( /usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:245: in save_obj _emit_insert_statements( /usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:1238: in _emit_insert_statements result = connection._execute_20( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1705: in _execute_20 return meth(self, args_10style, kwargs_10style, execution_options) /usr/lib64/python3.11/site-packages/sqlalchemy/sql/elements.py:334: in _execute_on_connection return connection._execute_clauseelement( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1572: in _execute_clauseelement ret = self._execute_context( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1806: in _execute_context self._handle_dbapi_exception( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:2124: in _handle_dbapi_exception util.raise_( /usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_ raise exception /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: in _execute_context context = constructor( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: in _init_compiled self._process_executesingle_defaults() /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: in 
_process_executesingle_defaults val = self.get_insert_default(c) /usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: in get_insert_default return super(PGExecutionContext, self).get_insert_default(column) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: in get_insert_default return self._exec_default(column, column.default, column.type) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: in _exec_default return default.arg(self) ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: in source_version_default dist = pkg_resources.get_distribution("datanommer.models") /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: in get_distribution dist = get_provider(dist) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: in get_provider return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: in require needed = self.resolve(parse_requirements(requirements)) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , requirements = [] env = , installer = None replace_conflicting = False, extras = None def resolve(self, requirements, env=None, installer=None, # noqa: C901 replace_conflicting=False, extras=None): """List all distributions needed to (recursively) meet `requirements` `requirements` must be a sequence of ``Requirement`` objects. `env`, if supplied, should be an ``Environment`` instance. If not supplied, it defaults to all distributions available within any entry or distribution in the working set. `installer`, if supplied, will be invoked with each requirement that cannot be met by an already-installed distribution; it should return a ``Distribution`` or ``None``. 
Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. required_by = collections.defaultdict(set) while requirements: # process dependencies breadth-first req = requirements.pop(0) if req in processed: # Ignore cyclic or redundant dependencies continue if not req_extras.markers_pass(req, extras): continue dist = best.get(req.key) if dist is None: # Find the best distribution and add it to the map dist = self.by_key.get(req.key) if dist is None or (dist not in req and replace_conflicting): ws = self if env is None: if dist is None: env = Environment(self.entries) else: # Use an empty environment and workingset to avoid # any further conflicts with the conflicting # distribution env = Environment([]) ws = WorkingSet([]) dist = best[req.key] = env.best_match( req, ws, installer, replace_conflicting=replace_conflicting ) if dist is None: requirers = required_by.get(req, None) > raise DistributionNotFound(req, requirers) E sqlalchemy.exc.StatementError: (pkg_resources.DistributionNotFound) The 'datanommer.models' distribution was not found and is required by the application E [SQL: INSERT INTO messages (msg_id, i, topic, timestamp, certificate, 
signature, category, username, crypto, source_name, source_version, msg, headers) VALUES (%(msg_id)s, %(i)s, %(topic)s, %(timestamp)s, %(certificate)s, %(signature)s, %(category)s, %(username)s, %(crypto)s, %(source_name)s, %(source_version)s, %(msg)s, %(headers)s) RETURNING messages.id] E [parameters: [{'topic': 'org.fedoraproject.prod.bodhi.newupdate', 'timestamp': datetime.datetime(2023, 5, 9, 2, 48, 43, tzinfo=datetime.timezone.utc), 'i': 0, 'hea ... (220 characters truncated) ... f-422e-881a-1422949ba60c', 'msg': {'encouragement': "You're doing great!"}, 'username': None, 'crypto': None, 'certificate': None, 'signature': None}]] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: StatementError _______________________________ test_grep_topics _______________________________ self = dialect = constructor = > statement = parameters = [{'category': 'bodhi', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:48:56+00:00'}, ...}] execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 14]}}) args = (, [{'category': 'bodhi', 'cer...erity': 20, 'sent-at': '2023-05-09T02:48:56+00:00'}, ...}], , []) kw = {'cache_hit': symbol('CACHE_MISS')} branched = yp = None conn = def _execute_context( self, dialect, constructor, statement, parameters, execution_options, *args, **kw ): """Create an :class:`.ExecutionContext` and execute, returning a :class:`_engine.CursorResult`.""" branched = self if self.__branch_from: # if this is a "branched" connection, do everything in terms # of the "root" connection, *except* for .close(), which is # the only feature that branching provides self = self.__branch_from if execution_options: yp = execution_options.get("yield_per", None) if yp: execution_options = execution_options.union( {"stream_results": True, "max_row_buffer": yp} ) try: conn = self._dbapi_connection if conn is None: conn = 
self._revalidate_connection() > context = constructor( dialect, self, conn, execution_options, *args, **kw ) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ cls = dialect = connection = dbapi_connection = execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 14]}}) compiled = parameters = [{'category': 'bodhi', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:48:56+00:00'}, ...}] invoked_statement = extracted_parameters = [], cache_hit = symbol('CACHE_MISS') @classmethod def _init_compiled( cls, dialect, connection, dbapi_connection, execution_options, compiled, parameters, invoked_statement, extracted_parameters, cache_hit=CACHING_DISABLED, ): """Initialize execution context for a Compiled construct.""" self = cls.__new__(cls) self.root_connection = connection self._dbapi_connection = dbapi_connection self.dialect = connection.dialect self.extracted_parameters = extracted_parameters self.invoked_statement = invoked_statement self.compiled = compiled self.cache_hit = cache_hit self.execution_options = execution_options self._is_future_result = ( connection._is_future or self.execution_options.get("future_result", False) ) self.result_column_struct = ( compiled._result_columns, compiled._ordered_columns, compiled._textual_ordered_columns, compiled._ad_hoc_textual, compiled._loose_column_name_matching, ) self.isinsert = compiled.isinsert self.isupdate = compiled.isupdate self.isdelete = compiled.isdelete self.is_text = compiled.isplaintext if self.isinsert or self.isupdate or self.isdelete: self.is_crud = True self._is_explicit_returning = bool(compiled.statement._returning) self._is_implicit_returning = bool( compiled.returning and not compiled.statement._returning ) if not parameters: self.compiled_parameters = [ compiled.construct_params( 
extracted_parameters=extracted_parameters, escape_names=False, ) ] else: self.compiled_parameters = [ compiled.construct_params( m, escape_names=False, _group_number=grp, extracted_parameters=extracted_parameters, ) for grp, m in enumerate(parameters) ] self.executemany = len(parameters) > 1 # this must occur before create_cursor() since the statement # has to be regexed in some cases for server side cursor if util.py2k: self.unicode_statement = util.text_type(compiled.string) else: self.unicode_statement = compiled.string self.cursor = self.create_cursor() if self.compiled.insert_prefetch or self.compiled.update_prefetch: if self.executemany: self._process_executemany_defaults() else: > self._process_executesingle_defaults() /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _process_executesingle_defaults(self): key_getter = self.compiled._within_exec_param_key_getter self.current_parameters = ( compiled_parameters ) = self.compiled_parameters[0] for c in self.compiled.insert_prefetch: if c.default and not c.default.is_sequence and c.default.is_scalar: val = c.default.arg else: > val = self.get_insert_default(c) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.primary_key and column is column.table._autoincrement_column: if column.server_default and column.server_default.has_argument: # pre-execute passive defaults on primary key columns return self._execute_scalar( "select %s" % column.server_default.arg, column.type ) elif column.default is None or ( column.default.is_sequence and column.default.optional ): # execute the sequence associated with a SERIAL primary # key column. 
for non-primary-key SERIAL, the ID just # generates server side. try: seq_name = column._postgresql_seq_name except AttributeError: tab = column.table.name col = column.name tab = tab[0 : 29 + max(0, (29 - len(col)))] col = col[0 : 29 + max(0, (29 - len(tab)))] name = "%s_%s_seq" % (tab, col) column._postgresql_seq_name = seq_name = name if column.table is not None: effective_schema = self.connection.schema_for_object( column.table ) else: effective_schema = None if effective_schema is not None: exc = 'select nextval(\'"%s"."%s"\')' % ( effective_schema, seq_name, ) else: exc = "select nextval('\"%s\"')" % (seq_name,) return self._execute_scalar(exc, column.type) > return super(PGExecutionContext, self).get_insert_default(column) /usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.default is None: return None else: > return self._exec_default(column, column.default, column.type) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) default = ColumnDefault() type_ = Unicode() def _exec_default(self, column, default, type_): if default.is_sequence: return self.fire_sequence(default, type_) elif default.is_callable: self.current_column = column > return default.arg(self) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ context = def source_version_default(context): > dist = pkg_resources.get_distribution("datanommer.models") ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ dist = Requirement.parse('datanommer.models') def get_distribution(dist): """Return a current distribution object for a Requirement or string""" if isinstance(dist, str): dist = Requirement.parse(dist) if isinstance(dist, Requirement): > dist = get_provider(dist) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ moduleOrReq = Requirement.parse('datanommer.models') def get_provider(moduleOrReq): """Return an IResourceProvider for the named module or requirement""" if isinstance(moduleOrReq, Requirement): > return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = requirements = ('datanommer.models',) def require(self, *requirements): """Ensure that distributions matching `requirements` are activated `requirements` must be a string or a (possibly-nested) sequence thereof, specifying the distributions and versions required. The return value is a sequence of the distributions that needed to be activated to fulfill the requirements; all relevant distributions are included, even if they were already activated in this working set. """ > needed = self.resolve(parse_requirements(requirements)) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , requirements = [] env = , installer = None replace_conflicting = False, extras = None def resolve(self, requirements, env=None, installer=None, # noqa: C901 replace_conflicting=False, extras=None): """List all distributions needed to (recursively) meet `requirements` `requirements` must be a sequence of ``Requirement`` objects. `env`, if supplied, should be an ``Environment`` instance. 
If not supplied, it defaults to all distributions available within any entry or distribution in the working set. `installer`, if supplied, will be invoked with each requirement that cannot be met by an already-installed distribution; it should return a ``Distribution`` or ``None``. Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. 
required_by = collections.defaultdict(set) while requirements: # process dependencies breadth-first req = requirements.pop(0) if req in processed: # Ignore cyclic or redundant dependencies continue if not req_extras.markers_pass(req, extras): continue dist = best.get(req.key) if dist is None: # Find the best distribution and add it to the map dist = self.by_key.get(req.key) if dist is None or (dist not in req and replace_conflicting): ws = self if env is None: if dist is None: env = Environment(self.entries) else: # Use an empty environment and workingset to avoid # any further conflicts with the conflicting # distribution env = Environment([]) ws = WorkingSet([]) dist = best[req.key] = env.best_match( req, ws, installer, replace_conflicting=replace_conflicting ) if dist is None: requirers = required_by.get(req, None) > raise DistributionNotFound(req, requirers) E pkg_resources.DistributionNotFound: The 'datanommer.models' distribution was not found and is required by the application /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: DistributionNotFound The above exception was the direct cause of the following exception: datanommer_models = def test_grep_topics(datanommer_models): example_message = generate_message(topic="org.fedoraproject.prod.bodhi.newupdate") > add(example_message) tests/test_model.py:255: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:147: in add Message.create( ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:275: in create session.flush() :2: in flush ??? 
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3444: in flush
    self._flush(objects)
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3583: in _flush
    with util.safe_reraise():
/usr/lib64/python3.11/site-packages/sqlalchemy/util/langhelpers.py:70: in __exit__
    compat.raise_(
/usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_
    raise exception
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3544: in _flush
    flush_context.execute()
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:456: in execute
    rec.execute(self)
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:630: in execute
    util.preloaded.orm_persistence.save_obj(
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:245: in save_obj
    _emit_insert_statements(
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:1238: in _emit_insert_statements
    result = connection._execute_20(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1705: in _execute_20
    return meth(self, args_10style, kwargs_10style, execution_options)
/usr/lib64/python3.11/site-packages/sqlalchemy/sql/elements.py:334: in _execute_on_connection
    return connection._execute_clauseelement(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1572: in _execute_clauseelement
    ret = self._execute_context(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1806: in _execute_context
    self._handle_dbapi_exception(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:2124: in _handle_dbapi_exception
    util.raise_(
/usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_
    raise exception
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: in _execute_context
    context = constructor(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: in _init_compiled
    self._process_executesingle_defaults()
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: in _process_executesingle_defaults
    val = self.get_insert_default(c)
/usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: in get_insert_default
    return super(PGExecutionContext, self).get_insert_default(column)
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: in get_insert_default
    return self._exec_default(column, column.default, column.type)
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: in _exec_default
    return default.arg(self)
../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: in source_version_default
    dist = pkg_resources.get_distribution("datanommer.models")
/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: in get_distribution
    dist = get_provider(dist)
/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: in get_provider
    return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0]
/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: in require
    needed = self.resolve(parse_requirements(requirements))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = , requirements = []
env = , installer = None
replace_conflicting = False, extras = None

    def resolve(self, requirements, env=None, installer=None,  # noqa: C901
                replace_conflicting=False, extras=None):
        """List all distributions needed to (recursively) meet `requirements`

        `requirements` must be a sequence of ``Requirement`` objects.

        `env`, if supplied, should be an ``Environment`` instance.  If
        not supplied, it defaults to all distributions available within any
        entry or distribution in the working set.

        `installer`, if supplied, will be invoked with each requirement
        that cannot be met by an already-installed distribution; it should
        return a ``Distribution`` or ``None``.
Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. required_by = collections.defaultdict(set) while requirements: # process dependencies breadth-first req = requirements.pop(0) if req in processed: # Ignore cyclic or redundant dependencies continue if not req_extras.markers_pass(req, extras): continue dist = best.get(req.key) if dist is None: # Find the best distribution and add it to the map dist = self.by_key.get(req.key) if dist is None or (dist not in req and replace_conflicting): ws = self if env is None: if dist is None: env = Environment(self.entries) else: # Use an empty environment and workingset to avoid # any further conflicts with the conflicting # distribution env = Environment([]) ws = WorkingSet([]) dist = best[req.key] = env.best_match( req, ws, installer, replace_conflicting=replace_conflicting ) if dist is None: requirers = required_by.get(req, None) > raise DistributionNotFound(req, requirers) E sqlalchemy.exc.StatementError: (pkg_resources.DistributionNotFound) The 'datanommer.models' distribution was not found and is required by the application E [SQL: INSERT INTO messages (msg_id, i, topic, timestamp, certificate, 
signature, category, username, crypto, source_name, source_version, msg, headers) VALUES (%(msg_id)s, %(i)s, %(topic)s, %(timestamp)s, %(certificate)s, %(signature)s, %(category)s, %(username)s, %(crypto)s, %(source_name)s, %(source_version)s, %(msg)s, %(headers)s) RETURNING messages.id] E [parameters: [{'topic': 'org.fedoraproject.prod.bodhi.newupdate', 'timestamp': datetime.datetime(2023, 5, 9, 2, 48, 56, tzinfo=datetime.timezone.utc), 'i': 0, 'hea ... (169 characters truncated) ... 9-483e-b297-8acf776e821c', 'msg': {'encouragement': "You're doing great!"}, 'username': None, 'crypto': None, 'certificate': None, 'signature': None}]] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: StatementError _____________________________ test_grep_not_topics _____________________________ self = dialect = constructor = > statement = parameters = [{'category': 'bodhi', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:49:09+00:00'}, ...}] execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 15]}}) args = (, [{'category': 'bodhi', 'cer...erity': 20, 'sent-at': '2023-05-09T02:49:09+00:00'}, ...}], , []) kw = {'cache_hit': symbol('CACHE_MISS')} branched = yp = None conn = def _execute_context( self, dialect, constructor, statement, parameters, execution_options, *args, **kw ): """Create an :class:`.ExecutionContext` and execute, returning a :class:`_engine.CursorResult`.""" branched = self if self.__branch_from: # if this is a "branched" connection, do everything in terms # of the "root" connection, *except* for .close(), which is # the only feature that branching provides self = self.__branch_from if execution_options: yp = execution_options.get("yield_per", None) if yp: execution_options = execution_options.union( {"stream_results": True, "max_row_buffer": yp} ) try: conn = self._dbapi_connection if conn is None: conn = 
self._revalidate_connection() > context = constructor( dialect, self, conn, execution_options, *args, **kw ) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ cls = dialect = connection = dbapi_connection = execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 15]}}) compiled = parameters = [{'category': 'bodhi', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:49:09+00:00'}, ...}] invoked_statement = extracted_parameters = [], cache_hit = symbol('CACHE_MISS') @classmethod def _init_compiled( cls, dialect, connection, dbapi_connection, execution_options, compiled, parameters, invoked_statement, extracted_parameters, cache_hit=CACHING_DISABLED, ): """Initialize execution context for a Compiled construct.""" self = cls.__new__(cls) self.root_connection = connection self._dbapi_connection = dbapi_connection self.dialect = connection.dialect self.extracted_parameters = extracted_parameters self.invoked_statement = invoked_statement self.compiled = compiled self.cache_hit = cache_hit self.execution_options = execution_options self._is_future_result = ( connection._is_future or self.execution_options.get("future_result", False) ) self.result_column_struct = ( compiled._result_columns, compiled._ordered_columns, compiled._textual_ordered_columns, compiled._ad_hoc_textual, compiled._loose_column_name_matching, ) self.isinsert = compiled.isinsert self.isupdate = compiled.isupdate self.isdelete = compiled.isdelete self.is_text = compiled.isplaintext if self.isinsert or self.isupdate or self.isdelete: self.is_crud = True self._is_explicit_returning = bool(compiled.statement._returning) self._is_implicit_returning = bool( compiled.returning and not compiled.statement._returning ) if not parameters: self.compiled_parameters = [ compiled.construct_params( 
extracted_parameters=extracted_parameters, escape_names=False, ) ] else: self.compiled_parameters = [ compiled.construct_params( m, escape_names=False, _group_number=grp, extracted_parameters=extracted_parameters, ) for grp, m in enumerate(parameters) ] self.executemany = len(parameters) > 1 # this must occur before create_cursor() since the statement # has to be regexed in some cases for server side cursor if util.py2k: self.unicode_statement = util.text_type(compiled.string) else: self.unicode_statement = compiled.string self.cursor = self.create_cursor() if self.compiled.insert_prefetch or self.compiled.update_prefetch: if self.executemany: self._process_executemany_defaults() else: > self._process_executesingle_defaults() /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _process_executesingle_defaults(self): key_getter = self.compiled._within_exec_param_key_getter self.current_parameters = ( compiled_parameters ) = self.compiled_parameters[0] for c in self.compiled.insert_prefetch: if c.default and not c.default.is_sequence and c.default.is_scalar: val = c.default.arg else: > val = self.get_insert_default(c) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.primary_key and column is column.table._autoincrement_column: if column.server_default and column.server_default.has_argument: # pre-execute passive defaults on primary key columns return self._execute_scalar( "select %s" % column.server_default.arg, column.type ) elif column.default is None or ( column.default.is_sequence and column.default.optional ): # execute the sequence associated with a SERIAL primary # key column. 
for non-primary-key SERIAL, the ID just # generates server side. try: seq_name = column._postgresql_seq_name except AttributeError: tab = column.table.name col = column.name tab = tab[0 : 29 + max(0, (29 - len(col)))] col = col[0 : 29 + max(0, (29 - len(tab)))] name = "%s_%s_seq" % (tab, col) column._postgresql_seq_name = seq_name = name if column.table is not None: effective_schema = self.connection.schema_for_object( column.table ) else: effective_schema = None if effective_schema is not None: exc = 'select nextval(\'"%s"."%s"\')' % ( effective_schema, seq_name, ) else: exc = "select nextval('\"%s\"')" % (seq_name,) return self._execute_scalar(exc, column.type) > return super(PGExecutionContext, self).get_insert_default(column) /usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.default is None: return None else: > return self._exec_default(column, column.default, column.type) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) default = ColumnDefault() type_ = Unicode() def _exec_default(self, column, default, type_): if default.is_sequence: return self.fire_sequence(default, type_) elif default.is_callable: self.current_column = column > return default.arg(self) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ context = def source_version_default(context): > dist = pkg_resources.get_distribution("datanommer.models") ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ dist = Requirement.parse('datanommer.models') def get_distribution(dist): """Return a current distribution object for a Requirement or string""" if isinstance(dist, str): dist = Requirement.parse(dist) if isinstance(dist, Requirement): > dist = get_provider(dist) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ moduleOrReq = Requirement.parse('datanommer.models') def get_provider(moduleOrReq): """Return an IResourceProvider for the named module or requirement""" if isinstance(moduleOrReq, Requirement): > return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = requirements = ('datanommer.models',) def require(self, *requirements): """Ensure that distributions matching `requirements` are activated `requirements` must be a string or a (possibly-nested) sequence thereof, specifying the distributions and versions required. The return value is a sequence of the distributions that needed to be activated to fulfill the requirements; all relevant distributions are included, even if they were already activated in this working set. """ > needed = self.resolve(parse_requirements(requirements)) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , requirements = [] env = , installer = None replace_conflicting = False, extras = None def resolve(self, requirements, env=None, installer=None, # noqa: C901 replace_conflicting=False, extras=None): """List all distributions needed to (recursively) meet `requirements` `requirements` must be a sequence of ``Requirement`` objects. `env`, if supplied, should be an ``Environment`` instance. 
If not supplied, it defaults to all distributions available within any entry or distribution in the working set. `installer`, if supplied, will be invoked with each requirement that cannot be met by an already-installed distribution; it should return a ``Distribution`` or ``None``. Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. 
required_by = collections.defaultdict(set) while requirements: # process dependencies breadth-first req = requirements.pop(0) if req in processed: # Ignore cyclic or redundant dependencies continue if not req_extras.markers_pass(req, extras): continue dist = best.get(req.key) if dist is None: # Find the best distribution and add it to the map dist = self.by_key.get(req.key) if dist is None or (dist not in req and replace_conflicting): ws = self if env is None: if dist is None: env = Environment(self.entries) else: # Use an empty environment and workingset to avoid # any further conflicts with the conflicting # distribution env = Environment([]) ws = WorkingSet([]) dist = best[req.key] = env.best_match( req, ws, installer, replace_conflicting=replace_conflicting ) if dist is None: requirers = required_by.get(req, None) > raise DistributionNotFound(req, requirers) E pkg_resources.DistributionNotFound: The 'datanommer.models' distribution was not found and is required by the application /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: DistributionNotFound The above exception was the direct cause of the following exception: datanommer_models = def test_grep_not_topics(datanommer_models): example_message = generate_message(topic="org.fedoraproject.prod.bodhi.newupdate") > add(example_message) tests/test_model.py:266: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:147: in add Message.create( ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:275: in create session.flush() :2: in flush ??? 
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3444: in flush
    self._flush(objects)
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3583: in _flush
    with util.safe_reraise():
/usr/lib64/python3.11/site-packages/sqlalchemy/util/langhelpers.py:70: in __exit__
    compat.raise_(
/usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_
    raise exception
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3544: in _flush
    flush_context.execute()
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:456: in execute
    rec.execute(self)
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:630: in execute
    util.preloaded.orm_persistence.save_obj(
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:245: in save_obj
    _emit_insert_statements(
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:1238: in _emit_insert_statements
    result = connection._execute_20(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1705: in _execute_20
    return meth(self, args_10style, kwargs_10style, execution_options)
/usr/lib64/python3.11/site-packages/sqlalchemy/sql/elements.py:334: in _execute_on_connection
    return connection._execute_clauseelement(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1572: in _execute_clauseelement
    ret = self._execute_context(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1806: in _execute_context
    self._handle_dbapi_exception(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:2124: in _handle_dbapi_exception
    util.raise_(
/usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_
    raise exception
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: in _execute_context
    context = constructor(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: in _init_compiled
    self._process_executesingle_defaults()
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: in _process_executesingle_defaults
    val = self.get_insert_default(c)
/usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: in get_insert_default
    return super(PGExecutionContext, self).get_insert_default(column)
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: in get_insert_default
    return self._exec_default(column, column.default, column.type)
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: in _exec_default
    return default.arg(self)
../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: in source_version_default
    dist = pkg_resources.get_distribution("datanommer.models")
/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: in get_distribution
    dist = get_provider(dist)
/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: in get_provider
    return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0]
/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: in require
    needed = self.resolve(parse_requirements(requirements))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = , requirements = []
env = , installer = None
replace_conflicting = False, extras = None

    def resolve(self, requirements, env=None, installer=None,  # noqa: C901
                replace_conflicting=False, extras=None):
        """List all distributions needed to (recursively) meet `requirements`

        `requirements` must be a sequence of ``Requirement`` objects.

        `env`, if supplied, should be an ``Environment`` instance.  If
        not supplied, it defaults to all distributions available within any
        entry or distribution in the working set.

        `installer`, if supplied, will be invoked with each requirement
        that cannot be met by an already-installed distribution; it should
        return a ``Distribution`` or ``None``.
Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. required_by = collections.defaultdict(set) while requirements: # process dependencies breadth-first req = requirements.pop(0) if req in processed: # Ignore cyclic or redundant dependencies continue if not req_extras.markers_pass(req, extras): continue dist = best.get(req.key) if dist is None: # Find the best distribution and add it to the map dist = self.by_key.get(req.key) if dist is None or (dist not in req and replace_conflicting): ws = self if env is None: if dist is None: env = Environment(self.entries) else: # Use an empty environment and workingset to avoid # any further conflicts with the conflicting # distribution env = Environment([]) ws = WorkingSet([]) dist = best[req.key] = env.best_match( req, ws, installer, replace_conflicting=replace_conflicting ) if dist is None: requirers = required_by.get(req, None) > raise DistributionNotFound(req, requirers) E sqlalchemy.exc.StatementError: (pkg_resources.DistributionNotFound) The 'datanommer.models' distribution was not found and is required by the application E [SQL: INSERT INTO messages (msg_id, i, topic, timestamp, certificate, 
signature, category, username, crypto, source_name, source_version, msg, headers) VALUES (%(msg_id)s, %(i)s, %(topic)s, %(timestamp)s, %(certificate)s, %(signature)s, %(category)s, %(username)s, %(crypto)s, %(source_name)s, %(source_version)s, %(msg)s, %(headers)s) RETURNING messages.id] E [parameters: [{'topic': 'org.fedoraproject.prod.bodhi.newupdate', 'timestamp': datetime.datetime(2023, 5, 9, 2, 49, 9, tzinfo=datetime.timezone.utc), 'i': 0, 'head ... (168 characters truncated) ... b-4a19-b300-a98793b88906', 'msg': {'encouragement': "You're doing great!"}, 'username': None, 'crypto': None, 'certificate': None, 'signature': None}]] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: StatementError _____________________________ test_grep_start_end ______________________________ self = dialect = constructor = > statement = parameters = [{'category': 'a', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2021-04-01T00:00:01'}, ...}] execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 16]}}) args = (, [{'category': 'a', 'certifi...ng_severity': 20, 'sent-at': '2021-04-01T00:00:01'}, ...}], , []) kw = {'cache_hit': symbol('CACHE_MISS')} branched = yp = None conn = def _execute_context( self, dialect, constructor, statement, parameters, execution_options, *args, **kw ): """Create an :class:`.ExecutionContext` and execute, returning a :class:`_engine.CursorResult`.""" branched = self if self.__branch_from: # if this is a "branched" connection, do everything in terms # of the "root" connection, *except* for .close(), which is # the only feature that branching provides self = self.__branch_from if execution_options: yp = execution_options.get("yield_per", None) if yp: execution_options = execution_options.union( {"stream_results": True, "max_row_buffer": yp} ) try: conn = self._dbapi_connection if conn is None: conn = self._revalidate_connection() > 
context = constructor( dialect, self, conn, execution_options, *args, **kw ) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ cls = dialect = connection = dbapi_connection = execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 16]}}) compiled = parameters = [{'category': 'a', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2021-04-01T00:00:01'}, ...}] invoked_statement = extracted_parameters = [], cache_hit = symbol('CACHE_MISS') @classmethod def _init_compiled( cls, dialect, connection, dbapi_connection, execution_options, compiled, parameters, invoked_statement, extracted_parameters, cache_hit=CACHING_DISABLED, ): """Initialize execution context for a Compiled construct.""" self = cls.__new__(cls) self.root_connection = connection self._dbapi_connection = dbapi_connection self.dialect = connection.dialect self.extracted_parameters = extracted_parameters self.invoked_statement = invoked_statement self.compiled = compiled self.cache_hit = cache_hit self.execution_options = execution_options self._is_future_result = ( connection._is_future or self.execution_options.get("future_result", False) ) self.result_column_struct = ( compiled._result_columns, compiled._ordered_columns, compiled._textual_ordered_columns, compiled._ad_hoc_textual, compiled._loose_column_name_matching, ) self.isinsert = compiled.isinsert self.isupdate = compiled.isupdate self.isdelete = compiled.isdelete self.is_text = compiled.isplaintext if self.isinsert or self.isupdate or self.isdelete: self.is_crud = True self._is_explicit_returning = bool(compiled.statement._returning) self._is_implicit_returning = bool( compiled.returning and not compiled.statement._returning ) if not parameters: self.compiled_parameters = [ compiled.construct_params( extracted_parameters=extracted_parameters, 
escape_names=False, ) ] else: self.compiled_parameters = [ compiled.construct_params( m, escape_names=False, _group_number=grp, extracted_parameters=extracted_parameters, ) for grp, m in enumerate(parameters) ] self.executemany = len(parameters) > 1 # this must occur before create_cursor() since the statement # has to be regexed in some cases for server side cursor if util.py2k: self.unicode_statement = util.text_type(compiled.string) else: self.unicode_statement = compiled.string self.cursor = self.create_cursor() if self.compiled.insert_prefetch or self.compiled.update_prefetch: if self.executemany: self._process_executemany_defaults() else: > self._process_executesingle_defaults() /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _process_executesingle_defaults(self): key_getter = self.compiled._within_exec_param_key_getter self.current_parameters = ( compiled_parameters ) = self.compiled_parameters[0] for c in self.compiled.insert_prefetch: if c.default and not c.default.is_sequence and c.default.is_scalar: val = c.default.arg else: > val = self.get_insert_default(c) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.primary_key and column is column.table._autoincrement_column: if column.server_default and column.server_default.has_argument: # pre-execute passive defaults on primary key columns return self._execute_scalar( "select %s" % column.server_default.arg, column.type ) elif column.default is None or ( column.default.is_sequence and column.default.optional ): # execute the sequence associated with a SERIAL primary # key column. for non-primary-key SERIAL, the ID just # generates server side. 
try: seq_name = column._postgresql_seq_name except AttributeError: tab = column.table.name col = column.name tab = tab[0 : 29 + max(0, (29 - len(col)))] col = col[0 : 29 + max(0, (29 - len(tab)))] name = "%s_%s_seq" % (tab, col) column._postgresql_seq_name = seq_name = name if column.table is not None: effective_schema = self.connection.schema_for_object( column.table ) else: effective_schema = None if effective_schema is not None: exc = 'select nextval(\'"%s"."%s"\')' % ( effective_schema, seq_name, ) else: exc = "select nextval('\"%s\"')" % (seq_name,) return self._execute_scalar(exc, column.type) > return super(PGExecutionContext, self).get_insert_default(column) /usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.default is None: return None else: > return self._exec_default(column, column.default, column.type) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) default = ColumnDefault() type_ = Unicode() def _exec_default(self, column, default, type_): if default.is_sequence: return self.fire_sequence(default, type_) elif default.is_callable: self.current_column = column > return default.arg(self) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ context = def source_version_default(context): > dist = pkg_resources.get_distribution("datanommer.models") ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ dist = Requirement.parse('datanommer.models') def get_distribution(dist): """Return a current distribution object for a Requirement or string""" if isinstance(dist, str): dist = Requirement.parse(dist) if isinstance(dist, Requirement): > dist = get_provider(dist) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ moduleOrReq = Requirement.parse('datanommer.models') def get_provider(moduleOrReq): """Return an IResourceProvider for the named module or requirement""" if isinstance(moduleOrReq, Requirement): > return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = requirements = ('datanommer.models',) def require(self, *requirements): """Ensure that distributions matching `requirements` are activated `requirements` must be a string or a (possibly-nested) sequence thereof, specifying the distributions and versions required. The return value is a sequence of the distributions that needed to be activated to fulfill the requirements; all relevant distributions are included, even if they were already activated in this working set. """ > needed = self.resolve(parse_requirements(requirements)) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , requirements = [] env = , installer = None replace_conflicting = False, extras = None def resolve(self, requirements, env=None, installer=None, # noqa: C901 replace_conflicting=False, extras=None): """List all distributions needed to (recursively) meet `requirements` `requirements` must be a sequence of ``Requirement`` objects. `env`, if supplied, should be an ``Environment`` instance. 
If not supplied, it defaults to all distributions available within any entry or distribution in the working set. `installer`, if supplied, will be invoked with each requirement that cannot be met by an already-installed distribution; it should return a ``Distribution`` or ``None``. Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. 
        required_by = collections.defaultdict(set)

        while requirements:
            # process dependencies breadth-first
            req = requirements.pop(0)
            if req in processed:
                # Ignore cyclic or redundant dependencies
                continue

            if not req_extras.markers_pass(req, extras):
                continue

            dist = best.get(req.key)
            if dist is None:
                # Find the best distribution and add it to the map
                dist = self.by_key.get(req.key)
                if dist is None or (dist not in req and replace_conflicting):
                    ws = self
                    if env is None:
                        if dist is None:
                            env = Environment(self.entries)
                        else:
                            # Use an empty environment and workingset to avoid
                            # any further conflicts with the conflicting
                            # distribution
                            env = Environment([])
                            ws = WorkingSet([])
                    dist = best[req.key] = env.best_match(
                        req, ws, installer,
                        replace_conflicting=replace_conflicting
                    )
                    if dist is None:
                        requirers = required_by.get(req, None)
>                       raise DistributionNotFound(req, requirers)
E                       pkg_resources.DistributionNotFound: The 'datanommer.models' distribution was not found and is required by the application

/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: DistributionNotFound

The above exception was the direct cause of the following exception:

datanommer_models =

    def test_grep_start_end(datanommer_models):
        example_message = generate_message()
        example_message._properties.headers["sent-at"] = "2021-04-01T00:00:01"
>       add(example_message)

tests/test_model.py:290:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:147: in add
    Message.create(
../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:275: in create
    session.flush()
:2: in flush
    ???
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3444: in flush self._flush(objects) /usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3583: in _flush with util.safe_reraise(): /usr/lib64/python3.11/site-packages/sqlalchemy/util/langhelpers.py:70: in __exit__ compat.raise_( /usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_ raise exception /usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3544: in _flush flush_context.execute() /usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:456: in execute rec.execute(self) /usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:630: in execute util.preloaded.orm_persistence.save_obj( /usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:245: in save_obj _emit_insert_statements( /usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:1238: in _emit_insert_statements result = connection._execute_20( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1705: in _execute_20 return meth(self, args_10style, kwargs_10style, execution_options) /usr/lib64/python3.11/site-packages/sqlalchemy/sql/elements.py:334: in _execute_on_connection return connection._execute_clauseelement( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1572: in _execute_clauseelement ret = self._execute_context( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1806: in _execute_context self._handle_dbapi_exception( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:2124: in _handle_dbapi_exception util.raise_( /usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_ raise exception /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: in _execute_context context = constructor( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: in _init_compiled self._process_executesingle_defaults() /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: in 
_process_executesingle_defaults val = self.get_insert_default(c) /usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: in get_insert_default return super(PGExecutionContext, self).get_insert_default(column) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: in get_insert_default return self._exec_default(column, column.default, column.type) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: in _exec_default return default.arg(self) ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: in source_version_default dist = pkg_resources.get_distribution("datanommer.models") /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: in get_distribution dist = get_provider(dist) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: in get_provider return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: in require needed = self.resolve(parse_requirements(requirements)) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , requirements = [] env = , installer = None replace_conflicting = False, extras = None def resolve(self, requirements, env=None, installer=None, # noqa: C901 replace_conflicting=False, extras=None): """List all distributions needed to (recursively) meet `requirements` `requirements` must be a sequence of ``Requirement`` objects. `env`, if supplied, should be an ``Environment`` instance. If not supplied, it defaults to all distributions available within any entry or distribution in the working set. `installer`, if supplied, will be invoked with each requirement that cannot be met by an already-installed distribution; it should return a ``Distribution`` or ``None``. 
Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. required_by = collections.defaultdict(set) while requirements: # process dependencies breadth-first req = requirements.pop(0) if req in processed: # Ignore cyclic or redundant dependencies continue if not req_extras.markers_pass(req, extras): continue dist = best.get(req.key) if dist is None: # Find the best distribution and add it to the map dist = self.by_key.get(req.key) if dist is None or (dist not in req and replace_conflicting): ws = self if env is None: if dist is None: env = Environment(self.entries) else: # Use an empty environment and workingset to avoid # any further conflicts with the conflicting # distribution env = Environment([]) ws = WorkingSet([]) dist = best[req.key] = env.best_match( req, ws, installer, replace_conflicting=replace_conflicting ) if dist is None: requirers = required_by.get(req, None) > raise DistributionNotFound(req, requirers) E sqlalchemy.exc.StatementError: (pkg_resources.DistributionNotFound) The 'datanommer.models' distribution was not found and is required by the application E [SQL: INSERT INTO messages (msg_id, i, topic, timestamp, certificate, 
signature, category, username, crypto, source_name, source_version, msg, headers) VALUES (%(msg_id)s, %(i)s, %(topic)s, %(timestamp)s, %(certificate)s, %(signature)s, %(category)s, %(username)s, %(crypto)s, %(source_name)s, %(source_version)s, %(msg)s, %(headers)s) RETURNING messages.id]
E [parameters: [{'topic': 'org.fedoraproject.test.a.nice.message', 'timestamp': datetime.datetime(2021, 4, 1, 0, 0, 1), 'i': 0, 'headers': {'fedora_messaging_schema' ... (126 characters truncated) ... 3-466f-8895-0edf5f894579', 'msg': {'encouragement': "You're doing great!"}, 'username': None, 'crypto': None, 'certificate': None, 'signature': None}]]

/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: StatementError
_______________________________ test_grep_msg_id _______________________________

self =
dialect =
constructor = >
statement =
parameters = [{'category': 'a', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:49:40+00:00'}, ...}]
execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 17]}})
args = (, [{'category': 'a', 'certifi...erity': 20, 'sent-at': '2023-05-09T02:49:40+00:00'}, ...}], , [])
kw = {'cache_hit': symbol('CACHE_MISS')}
branched =
yp = None
conn =

    def _execute_context(
        self,
        dialect,
        constructor,
        statement,
        parameters,
        execution_options,
        *args,
        **kw
    ):
        """Create an :class:`.ExecutionContext` and execute, returning
        a :class:`_engine.CursorResult`."""

        branched = self
        if self.__branch_from:
            # if this is a "branched" connection, do everything in terms
            # of the "root" connection, *except* for .close(), which is
            # the only feature that branching provides
            self = self.__branch_from

        if execution_options:
            yp = execution_options.get("yield_per", None)
            if yp:
                execution_options = execution_options.union(
                    {"stream_results": True, "max_row_buffer": yp}
                )
        try:
            conn = self._dbapi_connection
            if conn is None:
                conn =
self._revalidate_connection() > context = constructor( dialect, self, conn, execution_options, *args, **kw ) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ cls = dialect = connection = dbapi_connection = execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 17]}}) compiled = parameters = [{'category': 'a', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:49:40+00:00'}, ...}] invoked_statement = extracted_parameters = [], cache_hit = symbol('CACHE_MISS') @classmethod def _init_compiled( cls, dialect, connection, dbapi_connection, execution_options, compiled, parameters, invoked_statement, extracted_parameters, cache_hit=CACHING_DISABLED, ): """Initialize execution context for a Compiled construct.""" self = cls.__new__(cls) self.root_connection = connection self._dbapi_connection = dbapi_connection self.dialect = connection.dialect self.extracted_parameters = extracted_parameters self.invoked_statement = invoked_statement self.compiled = compiled self.cache_hit = cache_hit self.execution_options = execution_options self._is_future_result = ( connection._is_future or self.execution_options.get("future_result", False) ) self.result_column_struct = ( compiled._result_columns, compiled._ordered_columns, compiled._textual_ordered_columns, compiled._ad_hoc_textual, compiled._loose_column_name_matching, ) self.isinsert = compiled.isinsert self.isupdate = compiled.isupdate self.isdelete = compiled.isdelete self.is_text = compiled.isplaintext if self.isinsert or self.isupdate or self.isdelete: self.is_crud = True self._is_explicit_returning = bool(compiled.statement._returning) self._is_implicit_returning = bool( compiled.returning and not compiled.statement._returning ) if not parameters: self.compiled_parameters = [ compiled.construct_params( 
extracted_parameters=extracted_parameters, escape_names=False, ) ] else: self.compiled_parameters = [ compiled.construct_params( m, escape_names=False, _group_number=grp, extracted_parameters=extracted_parameters, ) for grp, m in enumerate(parameters) ] self.executemany = len(parameters) > 1 # this must occur before create_cursor() since the statement # has to be regexed in some cases for server side cursor if util.py2k: self.unicode_statement = util.text_type(compiled.string) else: self.unicode_statement = compiled.string self.cursor = self.create_cursor() if self.compiled.insert_prefetch or self.compiled.update_prefetch: if self.executemany: self._process_executemany_defaults() else: > self._process_executesingle_defaults() /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _process_executesingle_defaults(self): key_getter = self.compiled._within_exec_param_key_getter self.current_parameters = ( compiled_parameters ) = self.compiled_parameters[0] for c in self.compiled.insert_prefetch: if c.default and not c.default.is_sequence and c.default.is_scalar: val = c.default.arg else: > val = self.get_insert_default(c) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.primary_key and column is column.table._autoincrement_column: if column.server_default and column.server_default.has_argument: # pre-execute passive defaults on primary key columns return self._execute_scalar( "select %s" % column.server_default.arg, column.type ) elif column.default is None or ( column.default.is_sequence and column.default.optional ): # execute the sequence associated with a SERIAL primary # key column. 
for non-primary-key SERIAL, the ID just # generates server side. try: seq_name = column._postgresql_seq_name except AttributeError: tab = column.table.name col = column.name tab = tab[0 : 29 + max(0, (29 - len(col)))] col = col[0 : 29 + max(0, (29 - len(tab)))] name = "%s_%s_seq" % (tab, col) column._postgresql_seq_name = seq_name = name if column.table is not None: effective_schema = self.connection.schema_for_object( column.table ) else: effective_schema = None if effective_schema is not None: exc = 'select nextval(\'"%s"."%s"\')' % ( effective_schema, seq_name, ) else: exc = "select nextval('\"%s\"')" % (seq_name,) return self._execute_scalar(exc, column.type) > return super(PGExecutionContext, self).get_insert_default(column) /usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.default is None: return None else: > return self._exec_default(column, column.default, column.type) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) default = ColumnDefault() type_ = Unicode() def _exec_default(self, column, default, type_): if default.is_sequence: return self.fire_sequence(default, type_) elif default.is_callable: self.current_column = column > return default.arg(self) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ context = def source_version_default(context): > dist = pkg_resources.get_distribution("datanommer.models") ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ dist = Requirement.parse('datanommer.models') def get_distribution(dist): """Return a current distribution object for a Requirement or string""" if isinstance(dist, str): dist = Requirement.parse(dist) if isinstance(dist, Requirement): > dist = get_provider(dist) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ moduleOrReq = Requirement.parse('datanommer.models') def get_provider(moduleOrReq): """Return an IResourceProvider for the named module or requirement""" if isinstance(moduleOrReq, Requirement): > return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = requirements = ('datanommer.models',) def require(self, *requirements): """Ensure that distributions matching `requirements` are activated `requirements` must be a string or a (possibly-nested) sequence thereof, specifying the distributions and versions required. The return value is a sequence of the distributions that needed to be activated to fulfill the requirements; all relevant distributions are included, even if they were already activated in this working set. """ > needed = self.resolve(parse_requirements(requirements)) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , requirements = [] env = , installer = None replace_conflicting = False, extras = None def resolve(self, requirements, env=None, installer=None, # noqa: C901 replace_conflicting=False, extras=None): """List all distributions needed to (recursively) meet `requirements` `requirements` must be a sequence of ``Requirement`` objects. `env`, if supplied, should be an ``Environment`` instance. 
If not supplied, it defaults to all distributions available within any entry or distribution in the working set. `installer`, if supplied, will be invoked with each requirement that cannot be met by an already-installed distribution; it should return a ``Distribution`` or ``None``. Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. 
        required_by = collections.defaultdict(set)

        while requirements:
            # process dependencies breadth-first
            req = requirements.pop(0)
            if req in processed:
                # Ignore cyclic or redundant dependencies
                continue

            if not req_extras.markers_pass(req, extras):
                continue

            dist = best.get(req.key)
            if dist is None:
                # Find the best distribution and add it to the map
                dist = self.by_key.get(req.key)
                if dist is None or (dist not in req and replace_conflicting):
                    ws = self
                    if env is None:
                        if dist is None:
                            env = Environment(self.entries)
                        else:
                            # Use an empty environment and workingset to avoid
                            # any further conflicts with the conflicting
                            # distribution
                            env = Environment([])
                            ws = WorkingSet([])
                    dist = best[req.key] = env.best_match(
                        req, ws, installer,
                        replace_conflicting=replace_conflicting
                    )
                    if dist is None:
                        requirers = required_by.get(req, None)
>                       raise DistributionNotFound(req, requirers)
E                       pkg_resources.DistributionNotFound: The 'datanommer.models' distribution was not found and is required by the application

/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: DistributionNotFound

The above exception was the direct cause of the following exception:

datanommer_models =

    def test_grep_msg_id(datanommer_models):
        example_message = generate_message()
>       add(example_message)

tests/test_model.py:312:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:147: in add
    Message.create(
../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:275: in create
    session.flush()
:2: in flush
    ???
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3444: in flush self._flush(objects) /usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3583: in _flush with util.safe_reraise(): /usr/lib64/python3.11/site-packages/sqlalchemy/util/langhelpers.py:70: in __exit__ compat.raise_( /usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_ raise exception /usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3544: in _flush flush_context.execute() /usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:456: in execute rec.execute(self) /usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:630: in execute util.preloaded.orm_persistence.save_obj( /usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:245: in save_obj _emit_insert_statements( /usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:1238: in _emit_insert_statements result = connection._execute_20( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1705: in _execute_20 return meth(self, args_10style, kwargs_10style, execution_options) /usr/lib64/python3.11/site-packages/sqlalchemy/sql/elements.py:334: in _execute_on_connection return connection._execute_clauseelement( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1572: in _execute_clauseelement ret = self._execute_context( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1806: in _execute_context self._handle_dbapi_exception( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:2124: in _handle_dbapi_exception util.raise_( /usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_ raise exception /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: in _execute_context context = constructor( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: in _init_compiled self._process_executesingle_defaults() /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: in 
_process_executesingle_defaults val = self.get_insert_default(c) /usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: in get_insert_default return super(PGExecutionContext, self).get_insert_default(column) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: in get_insert_default return self._exec_default(column, column.default, column.type) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: in _exec_default return default.arg(self) ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: in source_version_default dist = pkg_resources.get_distribution("datanommer.models") /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: in get_distribution dist = get_provider(dist) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: in get_provider return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: in require needed = self.resolve(parse_requirements(requirements)) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , requirements = [] env = , installer = None replace_conflicting = False, extras = None def resolve(self, requirements, env=None, installer=None, # noqa: C901 replace_conflicting=False, extras=None): """List all distributions needed to (recursively) meet `requirements` `requirements` must be a sequence of ``Requirement`` objects. `env`, if supplied, should be an ``Environment`` instance. If not supplied, it defaults to all distributions available within any entry or distribution in the working set. `installer`, if supplied, will be invoked with each requirement that cannot be met by an already-installed distribution; it should return a ``Distribution`` or ``None``. 
Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. required_by = collections.defaultdict(set) while requirements: # process dependencies breadth-first req = requirements.pop(0) if req in processed: # Ignore cyclic or redundant dependencies continue if not req_extras.markers_pass(req, extras): continue dist = best.get(req.key) if dist is None: # Find the best distribution and add it to the map dist = self.by_key.get(req.key) if dist is None or (dist not in req and replace_conflicting): ws = self if env is None: if dist is None: env = Environment(self.entries) else: # Use an empty environment and workingset to avoid # any further conflicts with the conflicting # distribution env = Environment([]) ws = WorkingSet([]) dist = best[req.key] = env.best_match( req, ws, installer, replace_conflicting=replace_conflicting ) if dist is None: requirers = required_by.get(req, None) > raise DistributionNotFound(req, requirers) E sqlalchemy.exc.StatementError: (pkg_resources.DistributionNotFound) The 'datanommer.models' distribution was not found and is required by the application E [SQL: INSERT INTO messages (msg_id, i, topic, timestamp, certificate, 
signature, category, username, crypto, source_name, source_version, msg, headers) VALUES (%(msg_id)s, %(i)s, %(topic)s, %(timestamp)s, %(certificate)s, %(signature)s, %(category)s, %(username)s, %(crypto)s, %(source_name)s, %(source_version)s, %(msg)s, %(headers)s) RETURNING messages.id] E [parameters: [{'topic': 'org.fedoraproject.test.a.nice.message', 'timestamp': datetime.datetime(2023, 5, 9, 2, 49, 40, tzinfo=datetime.timezone.utc), 'i': 0, 'head ... (164 characters truncated) ... c-48bf-8cda-cdcd41f8171b', 'msg': {'encouragement': "You're doing great!"}, 'username': None, 'crypto': None, 'certificate': None, 'signature': None}]] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: StatementError _______________________________ test_grep_users ________________________________ self = dialect = constructor = > statement = parameters = [{'category': 'a', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:49:54+00:00'}, ...}] execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 18]}}) args = (, [{'category': 'a', 'certifi...erity': 20, 'sent-at': '2023-05-09T02:49:54+00:00'}, ...}], , []) kw = {'cache_hit': symbol('CACHE_MISS')} branched = yp = None conn = def _execute_context( self, dialect, constructor, statement, parameters, execution_options, *args, **kw ): """Create an :class:`.ExecutionContext` and execute, returning a :class:`_engine.CursorResult`.""" branched = self if self.__branch_from: # if this is a "branched" connection, do everything in terms # of the "root" connection, *except* for .close(), which is # the only feature that branching provides self = self.__branch_from if execution_options: yp = execution_options.get("yield_per", None) if yp: execution_options = execution_options.union( {"stream_results": True, "max_row_buffer": yp} ) try: conn = self._dbapi_connection if conn is None: conn = 
self._revalidate_connection() > context = constructor( dialect, self, conn, execution_options, *args, **kw ) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ cls = dialect = connection = dbapi_connection = execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 18]}}) compiled = parameters = [{'category': 'a', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:49:54+00:00'}, ...}] invoked_statement = extracted_parameters = [], cache_hit = symbol('CACHE_MISS') @classmethod def _init_compiled( cls, dialect, connection, dbapi_connection, execution_options, compiled, parameters, invoked_statement, extracted_parameters, cache_hit=CACHING_DISABLED, ): """Initialize execution context for a Compiled construct.""" self = cls.__new__(cls) self.root_connection = connection self._dbapi_connection = dbapi_connection self.dialect = connection.dialect self.extracted_parameters = extracted_parameters self.invoked_statement = invoked_statement self.compiled = compiled self.cache_hit = cache_hit self.execution_options = execution_options self._is_future_result = ( connection._is_future or self.execution_options.get("future_result", False) ) self.result_column_struct = ( compiled._result_columns, compiled._ordered_columns, compiled._textual_ordered_columns, compiled._ad_hoc_textual, compiled._loose_column_name_matching, ) self.isinsert = compiled.isinsert self.isupdate = compiled.isupdate self.isdelete = compiled.isdelete self.is_text = compiled.isplaintext if self.isinsert or self.isupdate or self.isdelete: self.is_crud = True self._is_explicit_returning = bool(compiled.statement._returning) self._is_implicit_returning = bool( compiled.returning and not compiled.statement._returning ) if not parameters: self.compiled_parameters = [ compiled.construct_params( 
extracted_parameters=extracted_parameters, escape_names=False, ) ] else: self.compiled_parameters = [ compiled.construct_params( m, escape_names=False, _group_number=grp, extracted_parameters=extracted_parameters, ) for grp, m in enumerate(parameters) ] self.executemany = len(parameters) > 1 # this must occur before create_cursor() since the statement # has to be regexed in some cases for server side cursor if util.py2k: self.unicode_statement = util.text_type(compiled.string) else: self.unicode_statement = compiled.string self.cursor = self.create_cursor() if self.compiled.insert_prefetch or self.compiled.update_prefetch: if self.executemany: self._process_executemany_defaults() else: > self._process_executesingle_defaults() /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _process_executesingle_defaults(self): key_getter = self.compiled._within_exec_param_key_getter self.current_parameters = ( compiled_parameters ) = self.compiled_parameters[0] for c in self.compiled.insert_prefetch: if c.default and not c.default.is_sequence and c.default.is_scalar: val = c.default.arg else: > val = self.get_insert_default(c) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.primary_key and column is column.table._autoincrement_column: if column.server_default and column.server_default.has_argument: # pre-execute passive defaults on primary key columns return self._execute_scalar( "select %s" % column.server_default.arg, column.type ) elif column.default is None or ( column.default.is_sequence and column.default.optional ): # execute the sequence associated with a SERIAL primary # key column. 
for non-primary-key SERIAL, the ID just # generates server side. try: seq_name = column._postgresql_seq_name except AttributeError: tab = column.table.name col = column.name tab = tab[0 : 29 + max(0, (29 - len(col)))] col = col[0 : 29 + max(0, (29 - len(tab)))] name = "%s_%s_seq" % (tab, col) column._postgresql_seq_name = seq_name = name if column.table is not None: effective_schema = self.connection.schema_for_object( column.table ) else: effective_schema = None if effective_schema is not None: exc = 'select nextval(\'"%s"."%s"\')' % ( effective_schema, seq_name, ) else: exc = "select nextval('\"%s\"')" % (seq_name,) return self._execute_scalar(exc, column.type) > return super(PGExecutionContext, self).get_insert_default(column) /usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.default is None: return None else: > return self._exec_default(column, column.default, column.type) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) default = ColumnDefault() type_ = Unicode() def _exec_default(self, column, default, type_): if default.is_sequence: return self.fire_sequence(default, type_) elif default.is_callable: self.current_column = column > return default.arg(self) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ context = def source_version_default(context): > dist = pkg_resources.get_distribution("datanommer.models") ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ dist = Requirement.parse('datanommer.models') def get_distribution(dist): """Return a current distribution object for a Requirement or string""" if isinstance(dist, str): dist = Requirement.parse(dist) if isinstance(dist, Requirement): > dist = get_provider(dist) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ moduleOrReq = Requirement.parse('datanommer.models') def get_provider(moduleOrReq): """Return an IResourceProvider for the named module or requirement""" if isinstance(moduleOrReq, Requirement): > return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = requirements = ('datanommer.models',) def require(self, *requirements): """Ensure that distributions matching `requirements` are activated `requirements` must be a string or a (possibly-nested) sequence thereof, specifying the distributions and versions required. The return value is a sequence of the distributions that needed to be activated to fulfill the requirements; all relevant distributions are included, even if they were already activated in this working set. """ > needed = self.resolve(parse_requirements(requirements)) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , requirements = [] env = , installer = None replace_conflicting = False, extras = None def resolve(self, requirements, env=None, installer=None, # noqa: C901 replace_conflicting=False, extras=None): """List all distributions needed to (recursively) meet `requirements` `requirements` must be a sequence of ``Requirement`` objects. `env`, if supplied, should be an ``Environment`` instance. 
If not supplied, it defaults to all distributions available within any entry or distribution in the working set. `installer`, if supplied, will be invoked with each requirement that cannot be met by an already-installed distribution; it should return a ``Distribution`` or ``None``. Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. 
required_by = collections.defaultdict(set) while requirements: # process dependencies breadth-first req = requirements.pop(0) if req in processed: # Ignore cyclic or redundant dependencies continue if not req_extras.markers_pass(req, extras): continue dist = best.get(req.key) if dist is None: # Find the best distribution and add it to the map dist = self.by_key.get(req.key) if dist is None or (dist not in req and replace_conflicting): ws = self if env is None: if dist is None: env = Environment(self.entries) else: # Use an empty environment and workingset to avoid # any further conflicts with the conflicting # distribution env = Environment([]) ws = WorkingSet([]) dist = best[req.key] = env.best_match( req, ws, installer, replace_conflicting=replace_conflicting ) if dist is None: requirers = required_by.get(req, None) > raise DistributionNotFound(req, requirers) E pkg_resources.DistributionNotFound: The 'datanommer.models' distribution was not found and is required by the application /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: DistributionNotFound The above exception was the direct cause of the following exception: datanommer_models = def test_grep_users(datanommer_models): example_message = generate_message() > add(example_message) tests/test_model.py:338: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:147: in add Message.create( ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:275: in create session.flush() :2: in flush ??? 
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3444: in flush self._flush(objects) /usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3583: in _flush with util.safe_reraise(): /usr/lib64/python3.11/site-packages/sqlalchemy/util/langhelpers.py:70: in __exit__ compat.raise_( /usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_ raise exception /usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3544: in _flush flush_context.execute() /usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:456: in execute rec.execute(self) /usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:630: in execute util.preloaded.orm_persistence.save_obj( /usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:245: in save_obj _emit_insert_statements( /usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:1238: in _emit_insert_statements result = connection._execute_20( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1705: in _execute_20 return meth(self, args_10style, kwargs_10style, execution_options) /usr/lib64/python3.11/site-packages/sqlalchemy/sql/elements.py:334: in _execute_on_connection return connection._execute_clauseelement( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1572: in _execute_clauseelement ret = self._execute_context( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1806: in _execute_context self._handle_dbapi_exception( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:2124: in _handle_dbapi_exception util.raise_( /usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_ raise exception /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: in _execute_context context = constructor( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: in _init_compiled self._process_executesingle_defaults() /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: in 
_process_executesingle_defaults val = self.get_insert_default(c) /usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: in get_insert_default return super(PGExecutionContext, self).get_insert_default(column) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: in get_insert_default return self._exec_default(column, column.default, column.type) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: in _exec_default return default.arg(self) ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: in source_version_default dist = pkg_resources.get_distribution("datanommer.models") /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: in get_distribution dist = get_provider(dist) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: in get_provider return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: in require needed = self.resolve(parse_requirements(requirements)) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , requirements = [] env = , installer = None replace_conflicting = False, extras = None def resolve(self, requirements, env=None, installer=None, # noqa: C901 replace_conflicting=False, extras=None): """List all distributions needed to (recursively) meet `requirements` `requirements` must be a sequence of ``Requirement`` objects. `env`, if supplied, should be an ``Environment`` instance. If not supplied, it defaults to all distributions available within any entry or distribution in the working set. `installer`, if supplied, will be invoked with each requirement that cannot be met by an already-installed distribution; it should return a ``Distribution`` or ``None``. 
Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. required_by = collections.defaultdict(set) while requirements: # process dependencies breadth-first req = requirements.pop(0) if req in processed: # Ignore cyclic or redundant dependencies continue if not req_extras.markers_pass(req, extras): continue dist = best.get(req.key) if dist is None: # Find the best distribution and add it to the map dist = self.by_key.get(req.key) if dist is None or (dist not in req and replace_conflicting): ws = self if env is None: if dist is None: env = Environment(self.entries) else: # Use an empty environment and workingset to avoid # any further conflicts with the conflicting # distribution env = Environment([]) ws = WorkingSet([]) dist = best[req.key] = env.best_match( req, ws, installer, replace_conflicting=replace_conflicting ) if dist is None: requirers = required_by.get(req, None) > raise DistributionNotFound(req, requirers) E sqlalchemy.exc.StatementError: (pkg_resources.DistributionNotFound) The 'datanommer.models' distribution was not found and is required by the application E [SQL: INSERT INTO messages (msg_id, i, topic, timestamp, certificate, 
signature, category, username, crypto, source_name, source_version, msg, headers) VALUES (%(msg_id)s, %(i)s, %(topic)s, %(timestamp)s, %(certificate)s, %(signature)s, %(category)s, %(username)s, %(crypto)s, %(source_name)s, %(source_version)s, %(msg)s, %(headers)s) RETURNING messages.id] E [parameters: [{'topic': 'org.fedoraproject.test.a.nice.message', 'timestamp': datetime.datetime(2023, 5, 9, 2, 49, 54, tzinfo=datetime.timezone.utc), 'i': 0, 'head ... (164 characters truncated) ... 4-4dd7-be35-caa2c545d35a', 'msg': {'encouragement': "You're doing great!"}, 'username': None, 'crypto': None, 'certificate': None, 'signature': None}]] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: StatementError _____________________________ test_grep_not_users ______________________________ self = dialect = constructor = > statement = parameters = [{'category': 'a', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:50:07+00:00'}, ...}] execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 19]}}) args = (, [{'category': 'a', 'certifi...erity': 20, 'sent-at': '2023-05-09T02:50:07+00:00'}, ...}], , []) kw = {'cache_hit': symbol('CACHE_MISS')} branched = yp = None conn = def _execute_context( self, dialect, constructor, statement, parameters, execution_options, *args, **kw ): """Create an :class:`.ExecutionContext` and execute, returning a :class:`_engine.CursorResult`.""" branched = self if self.__branch_from: # if this is a "branched" connection, do everything in terms # of the "root" connection, *except* for .close(), which is # the only feature that branching provides self = self.__branch_from if execution_options: yp = execution_options.get("yield_per", None) if yp: execution_options = execution_options.union( {"stream_results": True, "max_row_buffer": yp} ) try: conn = self._dbapi_connection if conn is None: conn = 
self._revalidate_connection() > context = constructor( dialect, self, conn, execution_options, *args, **kw ) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ cls = dialect = connection = dbapi_connection = execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 19]}}) compiled = parameters = [{'category': 'a', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:50:07+00:00'}, ...}] invoked_statement = extracted_parameters = [], cache_hit = symbol('CACHE_MISS') @classmethod def _init_compiled( cls, dialect, connection, dbapi_connection, execution_options, compiled, parameters, invoked_statement, extracted_parameters, cache_hit=CACHING_DISABLED, ): """Initialize execution context for a Compiled construct.""" self = cls.__new__(cls) self.root_connection = connection self._dbapi_connection = dbapi_connection self.dialect = connection.dialect self.extracted_parameters = extracted_parameters self.invoked_statement = invoked_statement self.compiled = compiled self.cache_hit = cache_hit self.execution_options = execution_options self._is_future_result = ( connection._is_future or self.execution_options.get("future_result", False) ) self.result_column_struct = ( compiled._result_columns, compiled._ordered_columns, compiled._textual_ordered_columns, compiled._ad_hoc_textual, compiled._loose_column_name_matching, ) self.isinsert = compiled.isinsert self.isupdate = compiled.isupdate self.isdelete = compiled.isdelete self.is_text = compiled.isplaintext if self.isinsert or self.isupdate or self.isdelete: self.is_crud = True self._is_explicit_returning = bool(compiled.statement._returning) self._is_implicit_returning = bool( compiled.returning and not compiled.statement._returning ) if not parameters: self.compiled_parameters = [ compiled.construct_params( 
extracted_parameters=extracted_parameters, escape_names=False, ) ] else: self.compiled_parameters = [ compiled.construct_params( m, escape_names=False, _group_number=grp, extracted_parameters=extracted_parameters, ) for grp, m in enumerate(parameters) ] self.executemany = len(parameters) > 1 # this must occur before create_cursor() since the statement # has to be regexed in some cases for server side cursor if util.py2k: self.unicode_statement = util.text_type(compiled.string) else: self.unicode_statement = compiled.string self.cursor = self.create_cursor() if self.compiled.insert_prefetch or self.compiled.update_prefetch: if self.executemany: self._process_executemany_defaults() else: > self._process_executesingle_defaults() /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _process_executesingle_defaults(self): key_getter = self.compiled._within_exec_param_key_getter self.current_parameters = ( compiled_parameters ) = self.compiled_parameters[0] for c in self.compiled.insert_prefetch: if c.default and not c.default.is_sequence and c.default.is_scalar: val = c.default.arg else: > val = self.get_insert_default(c) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.primary_key and column is column.table._autoincrement_column: if column.server_default and column.server_default.has_argument: # pre-execute passive defaults on primary key columns return self._execute_scalar( "select %s" % column.server_default.arg, column.type ) elif column.default is None or ( column.default.is_sequence and column.default.optional ): # execute the sequence associated with a SERIAL primary # key column. 
for non-primary-key SERIAL, the ID just # generates server side. try: seq_name = column._postgresql_seq_name except AttributeError: tab = column.table.name col = column.name tab = tab[0 : 29 + max(0, (29 - len(col)))] col = col[0 : 29 + max(0, (29 - len(tab)))] name = "%s_%s_seq" % (tab, col) column._postgresql_seq_name = seq_name = name if column.table is not None: effective_schema = self.connection.schema_for_object( column.table ) else: effective_schema = None if effective_schema is not None: exc = 'select nextval(\'"%s"."%s"\')' % ( effective_schema, seq_name, ) else: exc = "select nextval('\"%s\"')" % (seq_name,) return self._execute_scalar(exc, column.type) > return super(PGExecutionContext, self).get_insert_default(column) /usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.default is None: return None else: > return self._exec_default(column, column.default, column.type) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) default = ColumnDefault() type_ = Unicode() def _exec_default(self, column, default, type_): if default.is_sequence: return self.fire_sequence(default, type_) elif default.is_callable: self.current_column = column > return default.arg(self) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ context = def source_version_default(context): > dist = pkg_resources.get_distribution("datanommer.models") ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ dist = Requirement.parse('datanommer.models') def get_distribution(dist): """Return a current distribution object for a Requirement or string""" if isinstance(dist, str): dist = Requirement.parse(dist) if isinstance(dist, Requirement): > dist = get_provider(dist) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ moduleOrReq = Requirement.parse('datanommer.models') def get_provider(moduleOrReq): """Return an IResourceProvider for the named module or requirement""" if isinstance(moduleOrReq, Requirement): > return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = requirements = ('datanommer.models',) def require(self, *requirements): """Ensure that distributions matching `requirements` are activated `requirements` must be a string or a (possibly-nested) sequence thereof, specifying the distributions and versions required. The return value is a sequence of the distributions that needed to be activated to fulfill the requirements; all relevant distributions are included, even if they were already activated in this working set. """ > needed = self.resolve(parse_requirements(requirements)) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , requirements = [] env = , installer = None replace_conflicting = False, extras = None def resolve(self, requirements, env=None, installer=None, # noqa: C901 replace_conflicting=False, extras=None): """List all distributions needed to (recursively) meet `requirements` `requirements` must be a sequence of ``Requirement`` objects. `env`, if supplied, should be an ``Environment`` instance. 
If not supplied, it defaults to all distributions available within any entry or distribution in the working set. `installer`, if supplied, will be invoked with each requirement that cannot be met by an already-installed distribution; it should return a ``Distribution`` or ``None``. Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. 
required_by = collections.defaultdict(set) while requirements: # process dependencies breadth-first req = requirements.pop(0) if req in processed: # Ignore cyclic or redundant dependencies continue if not req_extras.markers_pass(req, extras): continue dist = best.get(req.key) if dist is None: # Find the best distribution and add it to the map dist = self.by_key.get(req.key) if dist is None or (dist not in req and replace_conflicting): ws = self if env is None: if dist is None: env = Environment(self.entries) else: # Use an empty environment and workingset to avoid # any further conflicts with the conflicting # distribution env = Environment([]) ws = WorkingSet([]) dist = best[req.key] = env.best_match( req, ws, installer, replace_conflicting=replace_conflicting ) if dist is None: requirers = required_by.get(req, None) > raise DistributionNotFound(req, requirers) E pkg_resources.DistributionNotFound: The 'datanommer.models' distribution was not found and is required by the application /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: DistributionNotFound The above exception was the direct cause of the following exception: datanommer_models = def test_grep_not_users(datanommer_models): example_message = generate_message() > add(example_message) tests/test_model.py:356: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:147: in add Message.create( ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:275: in create session.flush() :2: in flush ??? 
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3444: in flush self._flush(objects) /usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3583: in _flush with util.safe_reraise(): /usr/lib64/python3.11/site-packages/sqlalchemy/util/langhelpers.py:70: in __exit__ compat.raise_( /usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_ raise exception /usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3544: in _flush flush_context.execute() /usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:456: in execute rec.execute(self) /usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:630: in execute util.preloaded.orm_persistence.save_obj( /usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:245: in save_obj _emit_insert_statements( /usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:1238: in _emit_insert_statements result = connection._execute_20( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1705: in _execute_20 return meth(self, args_10style, kwargs_10style, execution_options) /usr/lib64/python3.11/site-packages/sqlalchemy/sql/elements.py:334: in _execute_on_connection return connection._execute_clauseelement( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1572: in _execute_clauseelement ret = self._execute_context( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1806: in _execute_context self._handle_dbapi_exception( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:2124: in _handle_dbapi_exception util.raise_( /usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_ raise exception /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: in _execute_context context = constructor( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: in _init_compiled self._process_executesingle_defaults() /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: in 
_process_executesingle_defaults val = self.get_insert_default(c) /usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: in get_insert_default return super(PGExecutionContext, self).get_insert_default(column) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: in get_insert_default return self._exec_default(column, column.default, column.type) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: in _exec_default return default.arg(self) ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: in source_version_default dist = pkg_resources.get_distribution("datanommer.models") /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: in get_distribution dist = get_provider(dist) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: in get_provider return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: in require needed = self.resolve(parse_requirements(requirements)) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , requirements = [] env = , installer = None replace_conflicting = False, extras = None def resolve(self, requirements, env=None, installer=None, # noqa: C901 replace_conflicting=False, extras=None): """List all distributions needed to (recursively) meet `requirements` `requirements` must be a sequence of ``Requirement`` objects. `env`, if supplied, should be an ``Environment`` instance. If not supplied, it defaults to all distributions available within any entry or distribution in the working set. `installer`, if supplied, will be invoked with each requirement that cannot be met by an already-installed distribution; it should return a ``Distribution`` or ``None``. 
Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. required_by = collections.defaultdict(set) while requirements: # process dependencies breadth-first req = requirements.pop(0) if req in processed: # Ignore cyclic or redundant dependencies continue if not req_extras.markers_pass(req, extras): continue dist = best.get(req.key) if dist is None: # Find the best distribution and add it to the map dist = self.by_key.get(req.key) if dist is None or (dist not in req and replace_conflicting): ws = self if env is None: if dist is None: env = Environment(self.entries) else: # Use an empty environment and workingset to avoid # any further conflicts with the conflicting # distribution env = Environment([]) ws = WorkingSet([]) dist = best[req.key] = env.best_match( req, ws, installer, replace_conflicting=replace_conflicting ) if dist is None: requirers = required_by.get(req, None) > raise DistributionNotFound(req, requirers) E sqlalchemy.exc.StatementError: (pkg_resources.DistributionNotFound) The 'datanommer.models' distribution was not found and is required by the application E [SQL: INSERT INTO messages (msg_id, i, topic, timestamp, certificate, 
signature, category, username, crypto, source_name, source_version, msg, headers) VALUES (%(msg_id)s, %(i)s, %(topic)s, %(timestamp)s, %(certificate)s, %(signature)s, %(category)s, %(username)s, %(crypto)s, %(source_name)s, %(source_version)s, %(msg)s, %(headers)s) RETURNING messages.id] E [parameters: [{'topic': 'org.fedoraproject.test.a.nice.message', 'timestamp': datetime.datetime(2023, 5, 9, 2, 50, 7, tzinfo=datetime.timezone.utc), 'i': 0, 'heade ... (163 characters truncated) ... c-42b9-8fbd-8686c38cda40', 'msg': {'encouragement': "You're doing great!"}, 'username': None, 'crypto': None, 'certificate': None, 'signature': None}]] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: StatementError ______________________________ test_grep_packages ______________________________ self = dialect = constructor = > statement = parameters = [{'category': 'a', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:50:20+00:00'}, ...}] execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 20]}}) args = (, [{'category': 'a', 'certifi...erity': 20, 'sent-at': '2023-05-09T02:50:20+00:00'}, ...}], , []) kw = {'cache_hit': symbol('CACHE_MISS')} branched = yp = None conn = def _execute_context( self, dialect, constructor, statement, parameters, execution_options, *args, **kw ): """Create an :class:`.ExecutionContext` and execute, returning a :class:`_engine.CursorResult`.""" branched = self if self.__branch_from: # if this is a "branched" connection, do everything in terms # of the "root" connection, *except* for .close(), which is # the only feature that branching provides self = self.__branch_from if execution_options: yp = execution_options.get("yield_per", None) if yp: execution_options = execution_options.union( {"stream_results": True, "max_row_buffer": yp} ) try: conn = self._dbapi_connection if conn is None: conn = 
self._revalidate_connection() > context = constructor( dialect, self, conn, execution_options, *args, **kw ) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ cls = dialect = connection = dbapi_connection = execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 20]}}) compiled = parameters = [{'category': 'a', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:50:20+00:00'}, ...}] invoked_statement = extracted_parameters = [], cache_hit = symbol('CACHE_MISS') @classmethod def _init_compiled( cls, dialect, connection, dbapi_connection, execution_options, compiled, parameters, invoked_statement, extracted_parameters, cache_hit=CACHING_DISABLED, ): """Initialize execution context for a Compiled construct.""" self = cls.__new__(cls) self.root_connection = connection self._dbapi_connection = dbapi_connection self.dialect = connection.dialect self.extracted_parameters = extracted_parameters self.invoked_statement = invoked_statement self.compiled = compiled self.cache_hit = cache_hit self.execution_options = execution_options self._is_future_result = ( connection._is_future or self.execution_options.get("future_result", False) ) self.result_column_struct = ( compiled._result_columns, compiled._ordered_columns, compiled._textual_ordered_columns, compiled._ad_hoc_textual, compiled._loose_column_name_matching, ) self.isinsert = compiled.isinsert self.isupdate = compiled.isupdate self.isdelete = compiled.isdelete self.is_text = compiled.isplaintext if self.isinsert or self.isupdate or self.isdelete: self.is_crud = True self._is_explicit_returning = bool(compiled.statement._returning) self._is_implicit_returning = bool( compiled.returning and not compiled.statement._returning ) if not parameters: self.compiled_parameters = [ compiled.construct_params( 
extracted_parameters=extracted_parameters, escape_names=False, ) ] else: self.compiled_parameters = [ compiled.construct_params( m, escape_names=False, _group_number=grp, extracted_parameters=extracted_parameters, ) for grp, m in enumerate(parameters) ] self.executemany = len(parameters) > 1 # this must occur before create_cursor() since the statement # has to be regexed in some cases for server side cursor if util.py2k: self.unicode_statement = util.text_type(compiled.string) else: self.unicode_statement = compiled.string self.cursor = self.create_cursor() if self.compiled.insert_prefetch or self.compiled.update_prefetch: if self.executemany: self._process_executemany_defaults() else: > self._process_executesingle_defaults() /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _process_executesingle_defaults(self): key_getter = self.compiled._within_exec_param_key_getter self.current_parameters = ( compiled_parameters ) = self.compiled_parameters[0] for c in self.compiled.insert_prefetch: if c.default and not c.default.is_sequence and c.default.is_scalar: val = c.default.arg else: > val = self.get_insert_default(c) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.primary_key and column is column.table._autoincrement_column: if column.server_default and column.server_default.has_argument: # pre-execute passive defaults on primary key columns return self._execute_scalar( "select %s" % column.server_default.arg, column.type ) elif column.default is None or ( column.default.is_sequence and column.default.optional ): # execute the sequence associated with a SERIAL primary # key column. 
for non-primary-key SERIAL, the ID just # generates server side. try: seq_name = column._postgresql_seq_name except AttributeError: tab = column.table.name col = column.name tab = tab[0 : 29 + max(0, (29 - len(col)))] col = col[0 : 29 + max(0, (29 - len(tab)))] name = "%s_%s_seq" % (tab, col) column._postgresql_seq_name = seq_name = name if column.table is not None: effective_schema = self.connection.schema_for_object( column.table ) else: effective_schema = None if effective_schema is not None: exc = 'select nextval(\'"%s"."%s"\')' % ( effective_schema, seq_name, ) else: exc = "select nextval('\"%s\"')" % (seq_name,) return self._execute_scalar(exc, column.type) > return super(PGExecutionContext, self).get_insert_default(column) /usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.default is None: return None else: > return self._exec_default(column, column.default, column.type) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) default = ColumnDefault() type_ = Unicode() def _exec_default(self, column, default, type_): if default.is_sequence: return self.fire_sequence(default, type_) elif default.is_callable: self.current_column = column > return default.arg(self) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ context = def source_version_default(context): > dist = pkg_resources.get_distribution("datanommer.models") ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: 
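The frames above show the root cause: the `source_version` column's default is a callable, `source_version_default`, which calls `pkg_resources.get_distribution("datanommer.models")` at INSERT time; in this chroot the package's metadata is not visible to `pkg_resources`, so the lookup raises `DistributionNotFound` mid-flush. A hypothetical, hedged drop-in (not the upstream fix) would resolve the version via `importlib.metadata` and fall back to a sentinel instead of failing the flush:

```python
# Hypothetical replacement for the failing callable default. The fallback
# value "unknown" and the helper name get_source_version are illustrative
# assumptions, not upstream datanommer.models code.
from importlib.metadata import version, PackageNotFoundError


def get_source_version(dist_name="datanommer.models"):
    try:
        return version(dist_name)
    except PackageNotFoundError:
        # Metadata missing (e.g. in a build chroot): record a sentinel
        # rather than letting the exception abort the INSERT.
        return "unknown"


def source_version_default(context):
    # SQLAlchemy passes an ExecutionContext to callable column defaults;
    # it is unused here.
    return get_source_version()
```

With this shape, the exception seen in the log cannot escape the default and become a `StatementError`.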
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ dist = Requirement.parse('datanommer.models') def get_distribution(dist): """Return a current distribution object for a Requirement or string""" if isinstance(dist, str): dist = Requirement.parse(dist) if isinstance(dist, Requirement): > dist = get_provider(dist) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ moduleOrReq = Requirement.parse('datanommer.models') def get_provider(moduleOrReq): """Return an IResourceProvider for the named module or requirement""" if isinstance(moduleOrReq, Requirement): > return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = requirements = ('datanommer.models',) def require(self, *requirements): """Ensure that distributions matching `requirements` are activated `requirements` must be a string or a (possibly-nested) sequence thereof, specifying the distributions and versions required. The return value is a sequence of the distributions that needed to be activated to fulfill the requirements; all relevant distributions are included, even if they were already activated in this working set. """ > needed = self.resolve(parse_requirements(requirements)) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , requirements = [] env = , installer = None replace_conflicting = False, extras = None def resolve(self, requirements, env=None, installer=None, # noqa: C901 replace_conflicting=False, extras=None): """List all distributions needed to (recursively) meet `requirements` `requirements` must be a sequence of ``Requirement`` objects. `env`, if supplied, should be an ``Environment`` instance. 
If not supplied, it defaults to all distributions available within any entry or distribution in the working set. `installer`, if supplied, will be invoked with each requirement that cannot be met by an already-installed distribution; it should return a ``Distribution`` or ``None``. Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. 
required_by = collections.defaultdict(set) while requirements: # process dependencies breadth-first req = requirements.pop(0) if req in processed: # Ignore cyclic or redundant dependencies continue if not req_extras.markers_pass(req, extras): continue dist = best.get(req.key) if dist is None: # Find the best distribution and add it to the map dist = self.by_key.get(req.key) if dist is None or (dist not in req and replace_conflicting): ws = self if env is None: if dist is None: env = Environment(self.entries) else: # Use an empty environment and workingset to avoid # any further conflicts with the conflicting # distribution env = Environment([]) ws = WorkingSet([]) dist = best[req.key] = env.best_match( req, ws, installer, replace_conflicting=replace_conflicting ) if dist is None: requirers = required_by.get(req, None) > raise DistributionNotFound(req, requirers) E pkg_resources.DistributionNotFound: The 'datanommer.models' distribution was not found and is required by the application /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: DistributionNotFound The above exception was the direct cause of the following exception: datanommer_models = def test_grep_packages(datanommer_models): example_message = generate_message() > add(example_message) tests/test_model.py:374: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:147: in add Message.create( ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:275: in create session.flush() :2: in flush ??? 
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3444: in flush self._flush(objects) /usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3583: in _flush with util.safe_reraise(): /usr/lib64/python3.11/site-packages/sqlalchemy/util/langhelpers.py:70: in __exit__ compat.raise_( /usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_ raise exception /usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3544: in _flush flush_context.execute() /usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:456: in execute rec.execute(self) /usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:630: in execute util.preloaded.orm_persistence.save_obj( /usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:245: in save_obj _emit_insert_statements( /usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:1238: in _emit_insert_statements result = connection._execute_20( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1705: in _execute_20 return meth(self, args_10style, kwargs_10style, execution_options) /usr/lib64/python3.11/site-packages/sqlalchemy/sql/elements.py:334: in _execute_on_connection return connection._execute_clauseelement( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1572: in _execute_clauseelement ret = self._execute_context( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1806: in _execute_context self._handle_dbapi_exception( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:2124: in _handle_dbapi_exception util.raise_( /usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_ raise exception /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: in _execute_context context = constructor( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: in _init_compiled self._process_executesingle_defaults() /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: in 
_process_executesingle_defaults val = self.get_insert_default(c) /usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: in get_insert_default return super(PGExecutionContext, self).get_insert_default(column) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: in get_insert_default return self._exec_default(column, column.default, column.type) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: in _exec_default return default.arg(self) ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: in source_version_default dist = pkg_resources.get_distribution("datanommer.models") /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: in get_distribution dist = get_provider(dist) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: in get_provider return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: in require needed = self.resolve(parse_requirements(requirements)) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , requirements = [] env = , installer = None replace_conflicting = False, extras = None def resolve(self, requirements, env=None, installer=None, # noqa: C901 replace_conflicting=False, extras=None): """List all distributions needed to (recursively) meet `requirements` `requirements` must be a sequence of ``Requirement`` objects. `env`, if supplied, should be an ``Environment`` instance. If not supplied, it defaults to all distributions available within any entry or distribution in the working set. `installer`, if supplied, will be invoked with each requirement that cannot be met by an already-installed distribution; it should return a ``Distribution`` or ``None``. 
Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. required_by = collections.defaultdict(set) while requirements: # process dependencies breadth-first req = requirements.pop(0) if req in processed: # Ignore cyclic or redundant dependencies continue if not req_extras.markers_pass(req, extras): continue dist = best.get(req.key) if dist is None: # Find the best distribution and add it to the map dist = self.by_key.get(req.key) if dist is None or (dist not in req and replace_conflicting): ws = self if env is None: if dist is None: env = Environment(self.entries) else: # Use an empty environment and workingset to avoid # any further conflicts with the conflicting # distribution env = Environment([]) ws = WorkingSet([]) dist = best[req.key] = env.best_match( req, ws, installer, replace_conflicting=replace_conflicting ) if dist is None: requirers = required_by.get(req, None) > raise DistributionNotFound(req, requirers) E sqlalchemy.exc.StatementError: (pkg_resources.DistributionNotFound) The 'datanommer.models' distribution was not found and is required by the application E [SQL: INSERT INTO messages (msg_id, i, topic, timestamp, certificate, 
signature, category, username, crypto, source_name, source_version, msg, headers) VALUES (%(msg_id)s, %(i)s, %(topic)s, %(timestamp)s, %(certificate)s, %(signature)s, %(category)s, %(username)s, %(crypto)s, %(source_name)s, %(source_version)s, %(msg)s, %(headers)s) RETURNING messages.id] E [parameters: [{'topic': 'org.fedoraproject.test.a.nice.message', 'timestamp': datetime.datetime(2023, 5, 9, 2, 50, 20, tzinfo=datetime.timezone.utc), 'i': 0, 'head ... (164 characters truncated) ... a-4c9c-b24d-990e74991758', 'msg': {'encouragement': "You're doing great!"}, 'username': None, 'crypto': None, 'certificate': None, 'signature': None}]] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: StatementError ____________________________ test_grep_not_packages ____________________________ self = dialect = constructor = > statement = parameters = [{'category': 'a', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:50:34+00:00'}, ...}] execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 21]}}) args = (, [{'category': 'a', 'certifi...erity': 20, 'sent-at': '2023-05-09T02:50:34+00:00'}, ...}], , []) kw = {'cache_hit': symbol('CACHE_MISS')} branched = yp = None conn = def _execute_context( self, dialect, constructor, statement, parameters, execution_options, *args, **kw ): """Create an :class:`.ExecutionContext` and execute, returning a :class:`_engine.CursorResult`.""" branched = self if self.__branch_from: # if this is a "branched" connection, do everything in terms # of the "root" connection, *except* for .close(), which is # the only feature that branching provides self = self.__branch_from if execution_options: yp = execution_options.get("yield_per", None) if yp: execution_options = execution_options.union( {"stream_results": True, "max_row_buffer": yp} ) try: conn = self._dbapi_connection if conn is None: conn = 
self._revalidate_connection() > context = constructor( dialect, self, conn, execution_options, *args, **kw ) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ cls = dialect = connection = dbapi_connection = execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 21]}}) compiled = parameters = [{'category': 'a', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:50:34+00:00'}, ...}] invoked_statement = extracted_parameters = [], cache_hit = symbol('CACHE_MISS') @classmethod def _init_compiled( cls, dialect, connection, dbapi_connection, execution_options, compiled, parameters, invoked_statement, extracted_parameters, cache_hit=CACHING_DISABLED, ): """Initialize execution context for a Compiled construct.""" self = cls.__new__(cls) self.root_connection = connection self._dbapi_connection = dbapi_connection self.dialect = connection.dialect self.extracted_parameters = extracted_parameters self.invoked_statement = invoked_statement self.compiled = compiled self.cache_hit = cache_hit self.execution_options = execution_options self._is_future_result = ( connection._is_future or self.execution_options.get("future_result", False) ) self.result_column_struct = ( compiled._result_columns, compiled._ordered_columns, compiled._textual_ordered_columns, compiled._ad_hoc_textual, compiled._loose_column_name_matching, ) self.isinsert = compiled.isinsert self.isupdate = compiled.isupdate self.isdelete = compiled.isdelete self.is_text = compiled.isplaintext if self.isinsert or self.isupdate or self.isdelete: self.is_crud = True self._is_explicit_returning = bool(compiled.statement._returning) self._is_implicit_returning = bool( compiled.returning and not compiled.statement._returning ) if not parameters: self.compiled_parameters = [ compiled.construct_params( 
extracted_parameters=extracted_parameters, escape_names=False, ) ] else: self.compiled_parameters = [ compiled.construct_params( m, escape_names=False, _group_number=grp, extracted_parameters=extracted_parameters, ) for grp, m in enumerate(parameters) ] self.executemany = len(parameters) > 1 # this must occur before create_cursor() since the statement # has to be regexed in some cases for server side cursor if util.py2k: self.unicode_statement = util.text_type(compiled.string) else: self.unicode_statement = compiled.string self.cursor = self.create_cursor() if self.compiled.insert_prefetch or self.compiled.update_prefetch: if self.executemany: self._process_executemany_defaults() else: > self._process_executesingle_defaults() /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _process_executesingle_defaults(self): key_getter = self.compiled._within_exec_param_key_getter self.current_parameters = ( compiled_parameters ) = self.compiled_parameters[0] for c in self.compiled.insert_prefetch: if c.default and not c.default.is_sequence and c.default.is_scalar: val = c.default.arg else: > val = self.get_insert_default(c) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.primary_key and column is column.table._autoincrement_column: if column.server_default and column.server_default.has_argument: # pre-execute passive defaults on primary key columns return self._execute_scalar( "select %s" % column.server_default.arg, column.type ) elif column.default is None or ( column.default.is_sequence and column.default.optional ): # execute the sequence associated with a SERIAL primary # key column. 
for non-primary-key SERIAL, the ID just # generates server side. try: seq_name = column._postgresql_seq_name except AttributeError: tab = column.table.name col = column.name tab = tab[0 : 29 + max(0, (29 - len(col)))] col = col[0 : 29 + max(0, (29 - len(tab)))] name = "%s_%s_seq" % (tab, col) column._postgresql_seq_name = seq_name = name if column.table is not None: effective_schema = self.connection.schema_for_object( column.table ) else: effective_schema = None if effective_schema is not None: exc = 'select nextval(\'"%s"."%s"\')' % ( effective_schema, seq_name, ) else: exc = "select nextval('\"%s\"')" % (seq_name,) return self._execute_scalar(exc, column.type) > return super(PGExecutionContext, self).get_insert_default(column) /usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.default is None: return None else: > return self._exec_default(column, column.default, column.type) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) default = ColumnDefault() type_ = Unicode() def _exec_default(self, column, default, type_): if default.is_sequence: return self.fire_sequence(default, type_) elif default.is_callable: self.current_column = column > return default.arg(self) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ context = def source_version_default(context): > dist = pkg_resources.get_distribution("datanommer.models") ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: 
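The lookup failing here can be reproduced with the standard library alone: `pkg_resources` (like its successor `importlib.metadata`) only finds a distribution whose `*.dist-info` metadata directory is on `sys.path`; an importable module file by itself is not enough. A small sketch, using an invented `fakemod` package in a temporary site directory:

```python
# Minimal stdlib reproduction of the metadata lookup that fails above.
# "fakemod" and the temporary site directory are illustrative assumptions.
import sys
import tempfile
import pathlib
from importlib.metadata import version, PackageNotFoundError


def lookup(name):
    """Return the installed version of `name`, or None when its metadata
    is absent -- the condition reported as DistributionNotFound above."""
    try:
        return version(name)
    except PackageNotFoundError:
        return None


def demo():
    with tempfile.TemporaryDirectory() as tmp:
        site = pathlib.Path(tmp)
        (site / "fakemod.py").write_text("x = 1\n")   # module file alone
        dist_info = site / "fakemod-1.0.dist-info"    # plus its metadata
        dist_info.mkdir()
        (dist_info / "METADATA").write_text(
            "Metadata-Version: 2.1\nName: fakemod\nVersion: 1.0\n"
        )
        sys.path.insert(0, str(site))
        try:
            # Found because the dist-info directory is on sys.path;
            # a name with no metadata anywhere resolves to None.
            return lookup("fakemod"), lookup("not-installed-anywhere")
        finally:
            sys.path.remove(str(site))
```

In the build above, the module files exist under the BUILDROOT site-packages, but the working set consulted by `pkg_resources` evidently does not include usable metadata for `datanommer.models`, producing the same `None`-equivalent outcome as the second lookup here.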
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ dist = Requirement.parse('datanommer.models') def get_distribution(dist): """Return a current distribution object for a Requirement or string""" if isinstance(dist, str): dist = Requirement.parse(dist) if isinstance(dist, Requirement): > dist = get_provider(dist) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ moduleOrReq = Requirement.parse('datanommer.models') def get_provider(moduleOrReq): """Return an IResourceProvider for the named module or requirement""" if isinstance(moduleOrReq, Requirement): > return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = requirements = ('datanommer.models',) def require(self, *requirements): """Ensure that distributions matching `requirements` are activated `requirements` must be a string or a (possibly-nested) sequence thereof, specifying the distributions and versions required. The return value is a sequence of the distributions that needed to be activated to fulfill the requirements; all relevant distributions are included, even if they were already activated in this working set. """ > needed = self.resolve(parse_requirements(requirements)) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , requirements = [] env = , installer = None replace_conflicting = False, extras = None def resolve(self, requirements, env=None, installer=None, # noqa: C901 replace_conflicting=False, extras=None): """List all distributions needed to (recursively) meet `requirements` `requirements` must be a sequence of ``Requirement`` objects. `env`, if supplied, should be an ``Environment`` instance. 
If not supplied, it defaults to all distributions available within any entry or distribution in the working set. `installer`, if supplied, will be invoked with each requirement that cannot be met by an already-installed distribution; it should return a ``Distribution`` or ``None``. Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. 
required_by = collections.defaultdict(set) while requirements: # process dependencies breadth-first req = requirements.pop(0) if req in processed: # Ignore cyclic or redundant dependencies continue if not req_extras.markers_pass(req, extras): continue dist = best.get(req.key) if dist is None: # Find the best distribution and add it to the map dist = self.by_key.get(req.key) if dist is None or (dist not in req and replace_conflicting): ws = self if env is None: if dist is None: env = Environment(self.entries) else: # Use an empty environment and workingset to avoid # any further conflicts with the conflicting # distribution env = Environment([]) ws = WorkingSet([]) dist = best[req.key] = env.best_match( req, ws, installer, replace_conflicting=replace_conflicting ) if dist is None: requirers = required_by.get(req, None) > raise DistributionNotFound(req, requirers) E pkg_resources.DistributionNotFound: The 'datanommer.models' distribution was not found and is required by the application /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: DistributionNotFound The above exception was the direct cause of the following exception: datanommer_models = def test_grep_not_packages(datanommer_models): example_message = generate_message() > add(example_message) tests/test_model.py:392: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:147: in add Message.create( ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:275: in create session.flush() :2: in flush ??? 
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3444: in flush self._flush(objects) /usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3583: in _flush with util.safe_reraise(): /usr/lib64/python3.11/site-packages/sqlalchemy/util/langhelpers.py:70: in __exit__ compat.raise_( /usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_ raise exception /usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3544: in _flush flush_context.execute() /usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:456: in execute rec.execute(self) /usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:630: in execute util.preloaded.orm_persistence.save_obj( /usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:245: in save_obj _emit_insert_statements( /usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:1238: in _emit_insert_statements result = connection._execute_20( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1705: in _execute_20 return meth(self, args_10style, kwargs_10style, execution_options) /usr/lib64/python3.11/site-packages/sqlalchemy/sql/elements.py:334: in _execute_on_connection return connection._execute_clauseelement( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1572: in _execute_clauseelement ret = self._execute_context( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1806: in _execute_context self._handle_dbapi_exception( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:2124: in _handle_dbapi_exception util.raise_( /usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_ raise exception /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: in _execute_context context = constructor( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: in _init_compiled self._process_executesingle_defaults() /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: in 
_process_executesingle_defaults val = self.get_insert_default(c) /usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: in get_insert_default return super(PGExecutionContext, self).get_insert_default(column) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: in get_insert_default return self._exec_default(column, column.default, column.type) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: in _exec_default return default.arg(self) ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: in source_version_default dist = pkg_resources.get_distribution("datanommer.models") /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: in get_distribution dist = get_provider(dist) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: in get_provider return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: in require needed = self.resolve(parse_requirements(requirements)) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , requirements = [] env = , installer = None replace_conflicting = False, extras = None def resolve(self, requirements, env=None, installer=None, # noqa: C901 replace_conflicting=False, extras=None): """List all distributions needed to (recursively) meet `requirements` `requirements` must be a sequence of ``Requirement`` objects. `env`, if supplied, should be an ``Environment`` instance. If not supplied, it defaults to all distributions available within any entry or distribution in the working set. `installer`, if supplied, will be invoked with each requirement that cannot be met by an already-installed distribution; it should return a ``Distribution`` or ``None``. 
Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. required_by = collections.defaultdict(set) while requirements: # process dependencies breadth-first req = requirements.pop(0) if req in processed: # Ignore cyclic or redundant dependencies continue if not req_extras.markers_pass(req, extras): continue dist = best.get(req.key) if dist is None: # Find the best distribution and add it to the map dist = self.by_key.get(req.key) if dist is None or (dist not in req and replace_conflicting): ws = self if env is None: if dist is None: env = Environment(self.entries) else: # Use an empty environment and workingset to avoid # any further conflicts with the conflicting # distribution env = Environment([]) ws = WorkingSet([]) dist = best[req.key] = env.best_match( req, ws, installer, replace_conflicting=replace_conflicting ) if dist is None: requirers = required_by.get(req, None) > raise DistributionNotFound(req, requirers) E sqlalchemy.exc.StatementError: (pkg_resources.DistributionNotFound) The 'datanommer.models' distribution was not found and is required by the application E [SQL: INSERT INTO messages (msg_id, i, topic, timestamp, certificate, 
signature, category, username, crypto, source_name, source_version, msg, headers) VALUES (%(msg_id)s, %(i)s, %(topic)s, %(timestamp)s, %(certificate)s, %(signature)s, %(category)s, %(username)s, %(crypto)s, %(source_name)s, %(source_version)s, %(msg)s, %(headers)s) RETURNING messages.id] E [parameters: [{'topic': 'org.fedoraproject.test.a.nice.message', 'timestamp': datetime.datetime(2023, 5, 9, 2, 50, 34, tzinfo=datetime.timezone.utc), 'i': 0, 'head ... (164 characters truncated) ... 1-4ba9-b32e-84a4ae84cac7', 'msg': {'encouragement': "You're doing great!"}, 'username': None, 'crypto': None, 'certificate': None, 'signature': None}]] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: StatementError ______________________________ test_grep_contains ______________________________ self = dialect = constructor = > statement = parameters = [{'category': 'bodhi', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:50:48+00:00'}, ...}] execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 22]}}) args = (, [{'category': 'bodhi', 'cer...erity': 20, 'sent-at': '2023-05-09T02:50:48+00:00'}, ...}], , []) kw = {'cache_hit': symbol('CACHE_MISS')} branched = yp = None conn = def _execute_context( self, dialect, constructor, statement, parameters, execution_options, *args, **kw ): """Create an :class:`.ExecutionContext` and execute, returning a :class:`_engine.CursorResult`.""" branched = self if self.__branch_from: # if this is a "branched" connection, do everything in terms # of the "root" connection, *except* for .close(), which is # the only feature that branching provides self = self.__branch_from if execution_options: yp = execution_options.get("yield_per", None) if yp: execution_options = execution_options.union( {"stream_results": True, "max_row_buffer": yp} ) try: conn = self._dbapi_connection if conn is None: conn = 
self._revalidate_connection() > context = constructor( dialect, self, conn, execution_options, *args, **kw ) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ cls = dialect = connection = dbapi_connection = execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 22]}}) compiled = parameters = [{'category': 'bodhi', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:50:48+00:00'}, ...}] invoked_statement = extracted_parameters = [], cache_hit = symbol('CACHE_MISS') @classmethod def _init_compiled( cls, dialect, connection, dbapi_connection, execution_options, compiled, parameters, invoked_statement, extracted_parameters, cache_hit=CACHING_DISABLED, ): """Initialize execution context for a Compiled construct.""" self = cls.__new__(cls) self.root_connection = connection self._dbapi_connection = dbapi_connection self.dialect = connection.dialect self.extracted_parameters = extracted_parameters self.invoked_statement = invoked_statement self.compiled = compiled self.cache_hit = cache_hit self.execution_options = execution_options self._is_future_result = ( connection._is_future or self.execution_options.get("future_result", False) ) self.result_column_struct = ( compiled._result_columns, compiled._ordered_columns, compiled._textual_ordered_columns, compiled._ad_hoc_textual, compiled._loose_column_name_matching, ) self.isinsert = compiled.isinsert self.isupdate = compiled.isupdate self.isdelete = compiled.isdelete self.is_text = compiled.isplaintext if self.isinsert or self.isupdate or self.isdelete: self.is_crud = True self._is_explicit_returning = bool(compiled.statement._returning) self._is_implicit_returning = bool( compiled.returning and not compiled.statement._returning ) if not parameters: self.compiled_parameters = [ compiled.construct_params( 
extracted_parameters=extracted_parameters, escape_names=False, ) ] else: self.compiled_parameters = [ compiled.construct_params( m, escape_names=False, _group_number=grp, extracted_parameters=extracted_parameters, ) for grp, m in enumerate(parameters) ] self.executemany = len(parameters) > 1 # this must occur before create_cursor() since the statement # has to be regexed in some cases for server side cursor if util.py2k: self.unicode_statement = util.text_type(compiled.string) else: self.unicode_statement = compiled.string self.cursor = self.create_cursor() if self.compiled.insert_prefetch or self.compiled.update_prefetch: if self.executemany: self._process_executemany_defaults() else: > self._process_executesingle_defaults() /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _process_executesingle_defaults(self): key_getter = self.compiled._within_exec_param_key_getter self.current_parameters = ( compiled_parameters ) = self.compiled_parameters[0] for c in self.compiled.insert_prefetch: if c.default and not c.default.is_sequence and c.default.is_scalar: val = c.default.arg else: > val = self.get_insert_default(c) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.primary_key and column is column.table._autoincrement_column: if column.server_default and column.server_default.has_argument: # pre-execute passive defaults on primary key columns return self._execute_scalar( "select %s" % column.server_default.arg, column.type ) elif column.default is None or ( column.default.is_sequence and column.default.optional ): # execute the sequence associated with a SERIAL primary # key column. 
for non-primary-key SERIAL, the ID just # generates server side. try: seq_name = column._postgresql_seq_name except AttributeError: tab = column.table.name col = column.name tab = tab[0 : 29 + max(0, (29 - len(col)))] col = col[0 : 29 + max(0, (29 - len(tab)))] name = "%s_%s_seq" % (tab, col) column._postgresql_seq_name = seq_name = name if column.table is not None: effective_schema = self.connection.schema_for_object( column.table ) else: effective_schema = None if effective_schema is not None: exc = 'select nextval(\'"%s"."%s"\')' % ( effective_schema, seq_name, ) else: exc = "select nextval('\"%s\"')" % (seq_name,) return self._execute_scalar(exc, column.type) > return super(PGExecutionContext, self).get_insert_default(column) /usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.default is None: return None else: > return self._exec_default(column, column.default, column.type) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) default = ColumnDefault() type_ = Unicode() def _exec_default(self, column, default, type_): if default.is_sequence: return self.fire_sequence(default, type_) elif default.is_callable: self.current_column = column > return default.arg(self) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ context = def source_version_default(context): > dist = pkg_resources.get_distribution("datanommer.models") ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ dist = Requirement.parse('datanommer.models') def get_distribution(dist): """Return a current distribution object for a Requirement or string""" if isinstance(dist, str): dist = Requirement.parse(dist) if isinstance(dist, Requirement): > dist = get_provider(dist) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ moduleOrReq = Requirement.parse('datanommer.models') def get_provider(moduleOrReq): """Return an IResourceProvider for the named module or requirement""" if isinstance(moduleOrReq, Requirement): > return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = requirements = ('datanommer.models',) def require(self, *requirements): """Ensure that distributions matching `requirements` are activated `requirements` must be a string or a (possibly-nested) sequence thereof, specifying the distributions and versions required. The return value is a sequence of the distributions that needed to be activated to fulfill the requirements; all relevant distributions are included, even if they were already activated in this working set. """ > needed = self.resolve(parse_requirements(requirements)) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , requirements = [] env = , installer = None replace_conflicting = False, extras = None def resolve(self, requirements, env=None, installer=None, # noqa: C901 replace_conflicting=False, extras=None): """List all distributions needed to (recursively) meet `requirements` `requirements` must be a sequence of ``Requirement`` objects. `env`, if supplied, should be an ``Environment`` instance. 
If not supplied, it defaults to all distributions available within any entry or distribution in the working set. `installer`, if supplied, will be invoked with each requirement that cannot be met by an already-installed distribution; it should return a ``Distribution`` or ``None``. Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. 
required_by = collections.defaultdict(set) while requirements: # process dependencies breadth-first req = requirements.pop(0) if req in processed: # Ignore cyclic or redundant dependencies continue if not req_extras.markers_pass(req, extras): continue dist = best.get(req.key) if dist is None: # Find the best distribution and add it to the map dist = self.by_key.get(req.key) if dist is None or (dist not in req and replace_conflicting): ws = self if env is None: if dist is None: env = Environment(self.entries) else: # Use an empty environment and workingset to avoid # any further conflicts with the conflicting # distribution env = Environment([]) ws = WorkingSet([]) dist = best[req.key] = env.best_match( req, ws, installer, replace_conflicting=replace_conflicting ) if dist is None: requirers = required_by.get(req, None) > raise DistributionNotFound(req, requirers) E pkg_resources.DistributionNotFound: The 'datanommer.models' distribution was not found and is required by the application /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: DistributionNotFound The above exception was the direct cause of the following exception: datanommer_models = def test_grep_contains(datanommer_models): example_message = generate_message(topic="org.fedoraproject.prod.bodhi.newupdate") > add(example_message) tests/test_model.py:410: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:147: in add Message.create( ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:275: in create session.flush() :2: in flush ??? 
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3444: in flush self._flush(objects) /usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3583: in _flush with util.safe_reraise(): /usr/lib64/python3.11/site-packages/sqlalchemy/util/langhelpers.py:70: in __exit__ compat.raise_( /usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_ raise exception /usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3544: in _flush flush_context.execute() /usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:456: in execute rec.execute(self) /usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:630: in execute util.preloaded.orm_persistence.save_obj( /usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:245: in save_obj _emit_insert_statements( /usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:1238: in _emit_insert_statements result = connection._execute_20( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1705: in _execute_20 return meth(self, args_10style, kwargs_10style, execution_options) /usr/lib64/python3.11/site-packages/sqlalchemy/sql/elements.py:334: in _execute_on_connection return connection._execute_clauseelement( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1572: in _execute_clauseelement ret = self._execute_context( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1806: in _execute_context self._handle_dbapi_exception( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:2124: in _handle_dbapi_exception util.raise_( /usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_ raise exception /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: in _execute_context context = constructor( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: in _init_compiled self._process_executesingle_defaults() /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: in 
_process_executesingle_defaults val = self.get_insert_default(c) /usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: in get_insert_default return super(PGExecutionContext, self).get_insert_default(column) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: in get_insert_default return self._exec_default(column, column.default, column.type) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: in _exec_default return default.arg(self) ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: in source_version_default dist = pkg_resources.get_distribution("datanommer.models") /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: in get_distribution dist = get_provider(dist) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: in get_provider return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: in require needed = self.resolve(parse_requirements(requirements)) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , requirements = [] env = , installer = None replace_conflicting = False, extras = None def resolve(self, requirements, env=None, installer=None, # noqa: C901 replace_conflicting=False, extras=None): """List all distributions needed to (recursively) meet `requirements` `requirements` must be a sequence of ``Requirement`` objects. `env`, if supplied, should be an ``Environment`` instance. If not supplied, it defaults to all distributions available within any entry or distribution in the working set. `installer`, if supplied, will be invoked with each requirement that cannot be met by an already-installed distribution; it should return a ``Distribution`` or ``None``. 
Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. required_by = collections.defaultdict(set) while requirements: # process dependencies breadth-first req = requirements.pop(0) if req in processed: # Ignore cyclic or redundant dependencies continue if not req_extras.markers_pass(req, extras): continue dist = best.get(req.key) if dist is None: # Find the best distribution and add it to the map dist = self.by_key.get(req.key) if dist is None or (dist not in req and replace_conflicting): ws = self if env is None: if dist is None: env = Environment(self.entries) else: # Use an empty environment and workingset to avoid # any further conflicts with the conflicting # distribution env = Environment([]) ws = WorkingSet([]) dist = best[req.key] = env.best_match( req, ws, installer, replace_conflicting=replace_conflicting ) if dist is None: requirers = required_by.get(req, None) > raise DistributionNotFound(req, requirers) E sqlalchemy.exc.StatementError: (pkg_resources.DistributionNotFound) The 'datanommer.models' distribution was not found and is required by the application E [SQL: INSERT INTO messages (msg_id, i, topic, timestamp, certificate, 
signature, category, username, crypto, source_name, source_version, msg, headers) VALUES (%(msg_id)s, %(i)s, %(topic)s, %(timestamp)s, %(certificate)s, %(signature)s, %(category)s, %(username)s, %(crypto)s, %(source_name)s, %(source_version)s, %(msg)s, %(headers)s) RETURNING messages.id] E [parameters: [{'topic': 'org.fedoraproject.prod.bodhi.newupdate', 'timestamp': datetime.datetime(2023, 5, 9, 2, 50, 48, tzinfo=datetime.timezone.utc), 'i': 0, 'hea ... (169 characters truncated) ... 1-4d88-8aeb-3605d72ee7bc', 'msg': {'encouragement': "You're doing great!"}, 'username': None, 'crypto': None, 'certificate': None, 'signature': None}]] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: StatementError _________________________ test_grep_rows_per_page_none _________________________ self = dialect = constructor = > statement = parameters = [{'category': 'a', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:51:01+00:00'}, ...}] execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 23]}}) args = (, [{'category': 'a', 'certifi...erity': 20, 'sent-at': '2023-05-09T02:51:01+00:00'}, ...}], , []) kw = {'cache_hit': symbol('CACHE_MISS')} branched = yp = None conn = def _execute_context( self, dialect, constructor, statement, parameters, execution_options, *args, **kw ): """Create an :class:`.ExecutionContext` and execute, returning a :class:`_engine.CursorResult`.""" branched = self if self.__branch_from: # if this is a "branched" connection, do everything in terms # of the "root" connection, *except* for .close(), which is # the only feature that branching provides self = self.__branch_from if execution_options: yp = execution_options.get("yield_per", None) if yp: execution_options = execution_options.union( {"stream_results": True, "max_row_buffer": yp} ) try: conn = self._dbapi_connection if conn is None: conn = 
self._revalidate_connection() > context = constructor( dialect, self, conn, execution_options, *args, **kw ) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ cls = dialect = connection = dbapi_connection = execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 23]}}) compiled = parameters = [{'category': 'a', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:51:01+00:00'}, ...}] invoked_statement = extracted_parameters = [], cache_hit = symbol('CACHE_MISS') @classmethod def _init_compiled( cls, dialect, connection, dbapi_connection, execution_options, compiled, parameters, invoked_statement, extracted_parameters, cache_hit=CACHING_DISABLED, ): """Initialize execution context for a Compiled construct.""" self = cls.__new__(cls) self.root_connection = connection self._dbapi_connection = dbapi_connection self.dialect = connection.dialect self.extracted_parameters = extracted_parameters self.invoked_statement = invoked_statement self.compiled = compiled self.cache_hit = cache_hit self.execution_options = execution_options self._is_future_result = ( connection._is_future or self.execution_options.get("future_result", False) ) self.result_column_struct = ( compiled._result_columns, compiled._ordered_columns, compiled._textual_ordered_columns, compiled._ad_hoc_textual, compiled._loose_column_name_matching, ) self.isinsert = compiled.isinsert self.isupdate = compiled.isupdate self.isdelete = compiled.isdelete self.is_text = compiled.isplaintext if self.isinsert or self.isupdate or self.isdelete: self.is_crud = True self._is_explicit_returning = bool(compiled.statement._returning) self._is_implicit_returning = bool( compiled.returning and not compiled.statement._returning ) if not parameters: self.compiled_parameters = [ compiled.construct_params( 
extracted_parameters=extracted_parameters, escape_names=False, ) ] else: self.compiled_parameters = [ compiled.construct_params( m, escape_names=False, _group_number=grp, extracted_parameters=extracted_parameters, ) for grp, m in enumerate(parameters) ] self.executemany = len(parameters) > 1 # this must occur before create_cursor() since the statement # has to be regexed in some cases for server side cursor if util.py2k: self.unicode_statement = util.text_type(compiled.string) else: self.unicode_statement = compiled.string self.cursor = self.create_cursor() if self.compiled.insert_prefetch or self.compiled.update_prefetch: if self.executemany: self._process_executemany_defaults() else: > self._process_executesingle_defaults() /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _process_executesingle_defaults(self): key_getter = self.compiled._within_exec_param_key_getter self.current_parameters = ( compiled_parameters ) = self.compiled_parameters[0] for c in self.compiled.insert_prefetch: if c.default and not c.default.is_sequence and c.default.is_scalar: val = c.default.arg else: > val = self.get_insert_default(c) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.primary_key and column is column.table._autoincrement_column: if column.server_default and column.server_default.has_argument: # pre-execute passive defaults on primary key columns return self._execute_scalar( "select %s" % column.server_default.arg, column.type ) elif column.default is None or ( column.default.is_sequence and column.default.optional ): # execute the sequence associated with a SERIAL primary # key column. 
for non-primary-key SERIAL, the ID just # generates server side. try: seq_name = column._postgresql_seq_name except AttributeError: tab = column.table.name col = column.name tab = tab[0 : 29 + max(0, (29 - len(col)))] col = col[0 : 29 + max(0, (29 - len(tab)))] name = "%s_%s_seq" % (tab, col) column._postgresql_seq_name = seq_name = name if column.table is not None: effective_schema = self.connection.schema_for_object( column.table ) else: effective_schema = None if effective_schema is not None: exc = 'select nextval(\'"%s"."%s"\')' % ( effective_schema, seq_name, ) else: exc = "select nextval('\"%s\"')" % (seq_name,) return self._execute_scalar(exc, column.type) > return super(PGExecutionContext, self).get_insert_default(column) /usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.default is None: return None else: > return self._exec_default(column, column.default, column.type) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) default = ColumnDefault() type_ = Unicode() def _exec_default(self, column, default, type_): if default.is_sequence: return self.fire_sequence(default, type_) elif default.is_callable: self.current_column = column > return default.arg(self) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ context = def source_version_default(context): > dist = pkg_resources.get_distribution("datanommer.models") ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ dist = Requirement.parse('datanommer.models') def get_distribution(dist): """Return a current distribution object for a Requirement or string""" if isinstance(dist, str): dist = Requirement.parse(dist) if isinstance(dist, Requirement): > dist = get_provider(dist) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ moduleOrReq = Requirement.parse('datanommer.models') def get_provider(moduleOrReq): """Return an IResourceProvider for the named module or requirement""" if isinstance(moduleOrReq, Requirement): > return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = requirements = ('datanommer.models',) def require(self, *requirements): """Ensure that distributions matching `requirements` are activated `requirements` must be a string or a (possibly-nested) sequence thereof, specifying the distributions and versions required. The return value is a sequence of the distributions that needed to be activated to fulfill the requirements; all relevant distributions are included, even if they were already activated in this working set. """ > needed = self.resolve(parse_requirements(requirements)) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , requirements = [] env = , installer = None replace_conflicting = False, extras = None def resolve(self, requirements, env=None, installer=None, # noqa: C901 replace_conflicting=False, extras=None): """List all distributions needed to (recursively) meet `requirements` `requirements` must be a sequence of ``Requirement`` objects. `env`, if supplied, should be an ``Environment`` instance. 
If not supplied, it defaults to all distributions available within any entry or distribution in the working set. `installer`, if supplied, will be invoked with each requirement that cannot be met by an already-installed distribution; it should return a ``Distribution`` or ``None``. Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. 
required_by = collections.defaultdict(set) while requirements: # process dependencies breadth-first req = requirements.pop(0) if req in processed: # Ignore cyclic or redundant dependencies continue if not req_extras.markers_pass(req, extras): continue dist = best.get(req.key) if dist is None: # Find the best distribution and add it to the map dist = self.by_key.get(req.key) if dist is None or (dist not in req and replace_conflicting): ws = self if env is None: if dist is None: env = Environment(self.entries) else: # Use an empty environment and workingset to avoid # any further conflicts with the conflicting # distribution env = Environment([]) ws = WorkingSet([]) dist = best[req.key] = env.best_match( req, ws, installer, replace_conflicting=replace_conflicting ) if dist is None: requirers = required_by.get(req, None) > raise DistributionNotFound(req, requirers) E pkg_resources.DistributionNotFound: The 'datanommer.models' distribution was not found and is required by the application /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: DistributionNotFound The above exception was the direct cause of the following exception: datanommer_models = def test_grep_rows_per_page_none(datanommer_models): for x in range(0, 200): example_message = generate_message() example_message.id = f"{x}" > add(example_message) tests/test_model.py:423: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:147: in add Message.create( ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:275: in create session.flush() :2: in flush ??? 
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3444: in flush self._flush(objects) /usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3583: in _flush with util.safe_reraise(): /usr/lib64/python3.11/site-packages/sqlalchemy/util/langhelpers.py:70: in __exit__ compat.raise_( /usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_ raise exception /usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3544: in _flush flush_context.execute() /usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:456: in execute rec.execute(self) /usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:630: in execute util.preloaded.orm_persistence.save_obj( /usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:245: in save_obj _emit_insert_statements( /usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:1238: in _emit_insert_statements result = connection._execute_20( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1705: in _execute_20 return meth(self, args_10style, kwargs_10style, execution_options) /usr/lib64/python3.11/site-packages/sqlalchemy/sql/elements.py:334: in _execute_on_connection return connection._execute_clauseelement( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1572: in _execute_clauseelement ret = self._execute_context( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1806: in _execute_context self._handle_dbapi_exception( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:2124: in _handle_dbapi_exception util.raise_( /usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_ raise exception /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: in _execute_context context = constructor( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: in _init_compiled self._process_executesingle_defaults() /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: in 
_process_executesingle_defaults val = self.get_insert_default(c) /usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: in get_insert_default return super(PGExecutionContext, self).get_insert_default(column) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: in get_insert_default return self._exec_default(column, column.default, column.type) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: in _exec_default return default.arg(self) ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: in source_version_default dist = pkg_resources.get_distribution("datanommer.models") /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: in get_distribution dist = get_provider(dist) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: in get_provider return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: in require needed = self.resolve(parse_requirements(requirements)) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , requirements = [] env = , installer = None replace_conflicting = False, extras = None def resolve(self, requirements, env=None, installer=None, # noqa: C901 replace_conflicting=False, extras=None): """List all distributions needed to (recursively) meet `requirements` `requirements` must be a sequence of ``Requirement`` objects. `env`, if supplied, should be an ``Environment`` instance. If not supplied, it defaults to all distributions available within any entry or distribution in the working set. `installer`, if supplied, will be invoked with each requirement that cannot be met by an already-installed distribution; it should return a ``Distribution`` or ``None``. 
Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. required_by = collections.defaultdict(set) while requirements: # process dependencies breadth-first req = requirements.pop(0) if req in processed: # Ignore cyclic or redundant dependencies continue if not req_extras.markers_pass(req, extras): continue dist = best.get(req.key) if dist is None: # Find the best distribution and add it to the map dist = self.by_key.get(req.key) if dist is None or (dist not in req and replace_conflicting): ws = self if env is None: if dist is None: env = Environment(self.entries) else: # Use an empty environment and workingset to avoid # any further conflicts with the conflicting # distribution env = Environment([]) ws = WorkingSet([]) dist = best[req.key] = env.best_match( req, ws, installer, replace_conflicting=replace_conflicting ) if dist is None: requirers = required_by.get(req, None) > raise DistributionNotFound(req, requirers) E sqlalchemy.exc.StatementError: (pkg_resources.DistributionNotFound) The 'datanommer.models' distribution was not found and is required by the application E [SQL: INSERT INTO messages (msg_id, i, topic, timestamp, certificate, 
signature, category, username, crypto, source_name, source_version, msg, headers) VALUES (%(msg_id)s, %(i)s, %(topic)s, %(timestamp)s, %(certificate)s, %(signature)s, %(category)s, %(username)s, %(crypto)s, %(source_name)s, %(source_version)s, %(msg)s, %(headers)s) RETURNING messages.id] E [parameters: [{'topic': 'org.fedoraproject.test.a.nice.message', 'timestamp': datetime.datetime(2023, 5, 9, 2, 51, 1, tzinfo=datetime.timezone.utc), 'i': 0, 'heade ... (128 characters truncated) ... gory': 'a', 'msg_id': '0', 'msg': {'encouragement': "You're doing great!"}, 'username': None, 'crypto': None, 'certificate': None, 'signature': None}]] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: StatementError _________________________ test_grep_rows_per_page_zero _________________________ self = dialect = constructor = > statement = parameters = [{'category': 'a', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:51:15+00:00'}, ...}] execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 24]}}) args = (, [{'category': 'a', 'certifi...erity': 20, 'sent-at': '2023-05-09T02:51:15+00:00'}, ...}], , []) kw = {'cache_hit': symbol('CACHE_MISS')} branched = yp = None conn = def _execute_context( self, dialect, constructor, statement, parameters, execution_options, *args, **kw ): """Create an :class:`.ExecutionContext` and execute, returning a :class:`_engine.CursorResult`.""" branched = self if self.__branch_from: # if this is a "branched" connection, do everything in terms # of the "root" connection, *except* for .close(), which is # the only feature that branching provides self = self.__branch_from if execution_options: yp = execution_options.get("yield_per", None) if yp: execution_options = execution_options.union( {"stream_results": True, "max_row_buffer": yp} ) try: conn = self._dbapi_connection if conn is None: conn = 
self._revalidate_connection() > context = constructor( dialect, self, conn, execution_options, *args, **kw ) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ cls = dialect = connection = dbapi_connection = execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 24]}}) compiled = parameters = [{'category': 'a', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:51:15+00:00'}, ...}] invoked_statement = extracted_parameters = [], cache_hit = symbol('CACHE_MISS') @classmethod def _init_compiled( cls, dialect, connection, dbapi_connection, execution_options, compiled, parameters, invoked_statement, extracted_parameters, cache_hit=CACHING_DISABLED, ): """Initialize execution context for a Compiled construct.""" self = cls.__new__(cls) self.root_connection = connection self._dbapi_connection = dbapi_connection self.dialect = connection.dialect self.extracted_parameters = extracted_parameters self.invoked_statement = invoked_statement self.compiled = compiled self.cache_hit = cache_hit self.execution_options = execution_options self._is_future_result = ( connection._is_future or self.execution_options.get("future_result", False) ) self.result_column_struct = ( compiled._result_columns, compiled._ordered_columns, compiled._textual_ordered_columns, compiled._ad_hoc_textual, compiled._loose_column_name_matching, ) self.isinsert = compiled.isinsert self.isupdate = compiled.isupdate self.isdelete = compiled.isdelete self.is_text = compiled.isplaintext if self.isinsert or self.isupdate or self.isdelete: self.is_crud = True self._is_explicit_returning = bool(compiled.statement._returning) self._is_implicit_returning = bool( compiled.returning and not compiled.statement._returning ) if not parameters: self.compiled_parameters = [ compiled.construct_params( 
extracted_parameters=extracted_parameters, escape_names=False, ) ] else: self.compiled_parameters = [ compiled.construct_params( m, escape_names=False, _group_number=grp, extracted_parameters=extracted_parameters, ) for grp, m in enumerate(parameters) ] self.executemany = len(parameters) > 1 # this must occur before create_cursor() since the statement # has to be regexed in some cases for server side cursor if util.py2k: self.unicode_statement = util.text_type(compiled.string) else: self.unicode_statement = compiled.string self.cursor = self.create_cursor() if self.compiled.insert_prefetch or self.compiled.update_prefetch: if self.executemany: self._process_executemany_defaults() else: > self._process_executesingle_defaults() /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _process_executesingle_defaults(self): key_getter = self.compiled._within_exec_param_key_getter self.current_parameters = ( compiled_parameters ) = self.compiled_parameters[0] for c in self.compiled.insert_prefetch: if c.default and not c.default.is_sequence and c.default.is_scalar: val = c.default.arg else: > val = self.get_insert_default(c) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.primary_key and column is column.table._autoincrement_column: if column.server_default and column.server_default.has_argument: # pre-execute passive defaults on primary key columns return self._execute_scalar( "select %s" % column.server_default.arg, column.type ) elif column.default is None or ( column.default.is_sequence and column.default.optional ): # execute the sequence associated with a SERIAL primary # key column. 
for non-primary-key SERIAL, the ID just # generates server side. try: seq_name = column._postgresql_seq_name except AttributeError: tab = column.table.name col = column.name tab = tab[0 : 29 + max(0, (29 - len(col)))] col = col[0 : 29 + max(0, (29 - len(tab)))] name = "%s_%s_seq" % (tab, col) column._postgresql_seq_name = seq_name = name if column.table is not None: effective_schema = self.connection.schema_for_object( column.table ) else: effective_schema = None if effective_schema is not None: exc = 'select nextval(\'"%s"."%s"\')' % ( effective_schema, seq_name, ) else: exc = "select nextval('\"%s\"')" % (seq_name,) return self._execute_scalar(exc, column.type) > return super(PGExecutionContext, self).get_insert_default(column) /usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.default is None: return None else: > return self._exec_default(column, column.default, column.type) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) default = ColumnDefault() type_ = Unicode() def _exec_default(self, column, default, type_): if default.is_sequence: return self.fire_sequence(default, type_) elif default.is_callable: self.current_column = column > return default.arg(self) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ context = def source_version_default(context): > dist = pkg_resources.get_distribution("datanommer.models") ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ dist = Requirement.parse('datanommer.models') def get_distribution(dist): """Return a current distribution object for a Requirement or string""" if isinstance(dist, str): dist = Requirement.parse(dist) if isinstance(dist, Requirement): > dist = get_provider(dist) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ moduleOrReq = Requirement.parse('datanommer.models') def get_provider(moduleOrReq): """Return an IResourceProvider for the named module or requirement""" if isinstance(moduleOrReq, Requirement): > return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = requirements = ('datanommer.models',) def require(self, *requirements): """Ensure that distributions matching `requirements` are activated `requirements` must be a string or a (possibly-nested) sequence thereof, specifying the distributions and versions required. The return value is a sequence of the distributions that needed to be activated to fulfill the requirements; all relevant distributions are included, even if they were already activated in this working set. """ > needed = self.resolve(parse_requirements(requirements)) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , requirements = [] env = , installer = None replace_conflicting = False, extras = None def resolve(self, requirements, env=None, installer=None, # noqa: C901 replace_conflicting=False, extras=None): """List all distributions needed to (recursively) meet `requirements` `requirements` must be a sequence of ``Requirement`` objects. `env`, if supplied, should be an ``Environment`` instance. 
If not supplied, it defaults to all distributions available within any entry or distribution in the working set. `installer`, if supplied, will be invoked with each requirement that cannot be met by an already-installed distribution; it should return a ``Distribution`` or ``None``. Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. 
required_by = collections.defaultdict(set) while requirements: # process dependencies breadth-first req = requirements.pop(0) if req in processed: # Ignore cyclic or redundant dependencies continue if not req_extras.markers_pass(req, extras): continue dist = best.get(req.key) if dist is None: # Find the best distribution and add it to the map dist = self.by_key.get(req.key) if dist is None or (dist not in req and replace_conflicting): ws = self if env is None: if dist is None: env = Environment(self.entries) else: # Use an empty environment and workingset to avoid # any further conflicts with the conflicting # distribution env = Environment([]) ws = WorkingSet([]) dist = best[req.key] = env.best_match( req, ws, installer, replace_conflicting=replace_conflicting ) if dist is None: requirers = required_by.get(req, None) > raise DistributionNotFound(req, requirers) E pkg_resources.DistributionNotFound: The 'datanommer.models' distribution was not found and is required by the application /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: DistributionNotFound The above exception was the direct cause of the following exception: datanommer_models = def test_grep_rows_per_page_zero(datanommer_models): for x in range(0, 200): example_message = generate_message() example_message.id = f"{x}" > add(example_message) tests/test_model.py:442: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:147: in add Message.create( ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:275: in create session.flush() :2: in flush ??? 
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3444: in flush self._flush(objects) /usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3583: in _flush with util.safe_reraise(): /usr/lib64/python3.11/site-packages/sqlalchemy/util/langhelpers.py:70: in __exit__ compat.raise_( /usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_ raise exception /usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3544: in _flush flush_context.execute() /usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:456: in execute rec.execute(self) /usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:630: in execute util.preloaded.orm_persistence.save_obj( /usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:245: in save_obj _emit_insert_statements( /usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:1238: in _emit_insert_statements result = connection._execute_20( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1705: in _execute_20 return meth(self, args_10style, kwargs_10style, execution_options) /usr/lib64/python3.11/site-packages/sqlalchemy/sql/elements.py:334: in _execute_on_connection return connection._execute_clauseelement( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1572: in _execute_clauseelement ret = self._execute_context( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1806: in _execute_context self._handle_dbapi_exception( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:2124: in _handle_dbapi_exception util.raise_( /usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_ raise exception /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: in _execute_context context = constructor( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: in _init_compiled self._process_executesingle_defaults() /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: in 
_process_executesingle_defaults val = self.get_insert_default(c) /usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: in get_insert_default return super(PGExecutionContext, self).get_insert_default(column) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: in get_insert_default return self._exec_default(column, column.default, column.type) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: in _exec_default return default.arg(self) ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: in source_version_default dist = pkg_resources.get_distribution("datanommer.models") /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: in get_distribution dist = get_provider(dist) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: in get_provider return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: in require needed = self.resolve(parse_requirements(requirements)) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , requirements = [] env = , installer = None replace_conflicting = False, extras = None def resolve(self, requirements, env=None, installer=None, # noqa: C901 replace_conflicting=False, extras=None): """List all distributions needed to (recursively) meet `requirements` `requirements` must be a sequence of ``Requirement`` objects. `env`, if supplied, should be an ``Environment`` instance. If not supplied, it defaults to all distributions available within any entry or distribution in the working set. `installer`, if supplied, will be invoked with each requirement that cannot be met by an already-installed distribution; it should return a ``Distribution`` or ``None``. 
Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. required_by = collections.defaultdict(set) while requirements: # process dependencies breadth-first req = requirements.pop(0) if req in processed: # Ignore cyclic or redundant dependencies continue if not req_extras.markers_pass(req, extras): continue dist = best.get(req.key) if dist is None: # Find the best distribution and add it to the map dist = self.by_key.get(req.key) if dist is None or (dist not in req and replace_conflicting): ws = self if env is None: if dist is None: env = Environment(self.entries) else: # Use an empty environment and workingset to avoid # any further conflicts with the conflicting # distribution env = Environment([]) ws = WorkingSet([]) dist = best[req.key] = env.best_match( req, ws, installer, replace_conflicting=replace_conflicting ) if dist is None: requirers = required_by.get(req, None) > raise DistributionNotFound(req, requirers) E sqlalchemy.exc.StatementError: (pkg_resources.DistributionNotFound) The 'datanommer.models' distribution was not found and is required by the application E [SQL: INSERT INTO messages (msg_id, i, topic, timestamp, certificate, 
signature, category, username, crypto, source_name, source_version, msg, headers) VALUES (%(msg_id)s, %(i)s, %(topic)s, %(timestamp)s, %(certificate)s, %(signature)s, %(category)s, %(username)s, %(crypto)s, %(source_name)s, %(source_version)s, %(msg)s, %(headers)s) RETURNING messages.id] E [parameters: [{'topic': 'org.fedoraproject.test.a.nice.message', 'timestamp': datetime.datetime(2023, 5, 9, 2, 51, 15, tzinfo=datetime.timezone.utc), 'i': 0, 'head ... (129 characters truncated) ... gory': 'a', 'msg_id': '0', 'msg': {'encouragement': "You're doing great!"}, 'username': None, 'crypto': None, 'certificate': None, 'signature': None}]] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: StatementError _______________________________ test_grep_defer ________________________________ self = dialect = constructor = > statement = parameters = [{'category': 'a', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:51:28+00:00'}, ...}] execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 25]}}) args = (, [{'category': 'a', 'certifi...erity': 20, 'sent-at': '2023-05-09T02:51:28+00:00'}, ...}], , []) kw = {'cache_hit': symbol('CACHE_MISS')} branched = yp = None conn = def _execute_context( self, dialect, constructor, statement, parameters, execution_options, *args, **kw ): """Create an :class:`.ExecutionContext` and execute, returning a :class:`_engine.CursorResult`.""" branched = self if self.__branch_from: # if this is a "branched" connection, do everything in terms # of the "root" connection, *except* for .close(), which is # the only feature that branching provides self = self.__branch_from if execution_options: yp = execution_options.get("yield_per", None) if yp: execution_options = execution_options.union( {"stream_results": True, "max_row_buffer": yp} ) try: conn = self._dbapi_connection if conn is None: conn = 
self._revalidate_connection() > context = constructor( dialect, self, conn, execution_options, *args, **kw ) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ cls = dialect = connection = dbapi_connection = execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 25]}}) compiled = parameters = [{'category': 'a', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:51:28+00:00'}, ...}] invoked_statement = extracted_parameters = [], cache_hit = symbol('CACHE_MISS') @classmethod def _init_compiled( cls, dialect, connection, dbapi_connection, execution_options, compiled, parameters, invoked_statement, extracted_parameters, cache_hit=CACHING_DISABLED, ): """Initialize execution context for a Compiled construct.""" self = cls.__new__(cls) self.root_connection = connection self._dbapi_connection = dbapi_connection self.dialect = connection.dialect self.extracted_parameters = extracted_parameters self.invoked_statement = invoked_statement self.compiled = compiled self.cache_hit = cache_hit self.execution_options = execution_options self._is_future_result = ( connection._is_future or self.execution_options.get("future_result", False) ) self.result_column_struct = ( compiled._result_columns, compiled._ordered_columns, compiled._textual_ordered_columns, compiled._ad_hoc_textual, compiled._loose_column_name_matching, ) self.isinsert = compiled.isinsert self.isupdate = compiled.isupdate self.isdelete = compiled.isdelete self.is_text = compiled.isplaintext if self.isinsert or self.isupdate or self.isdelete: self.is_crud = True self._is_explicit_returning = bool(compiled.statement._returning) self._is_implicit_returning = bool( compiled.returning and not compiled.statement._returning ) if not parameters: self.compiled_parameters = [ compiled.construct_params( 
extracted_parameters=extracted_parameters, escape_names=False, ) ] else: self.compiled_parameters = [ compiled.construct_params( m, escape_names=False, _group_number=grp, extracted_parameters=extracted_parameters, ) for grp, m in enumerate(parameters) ] self.executemany = len(parameters) > 1 # this must occur before create_cursor() since the statement # has to be regexed in some cases for server side cursor if util.py2k: self.unicode_statement = util.text_type(compiled.string) else: self.unicode_statement = compiled.string self.cursor = self.create_cursor() if self.compiled.insert_prefetch or self.compiled.update_prefetch: if self.executemany: self._process_executemany_defaults() else: > self._process_executesingle_defaults() /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _process_executesingle_defaults(self): key_getter = self.compiled._within_exec_param_key_getter self.current_parameters = ( compiled_parameters ) = self.compiled_parameters[0] for c in self.compiled.insert_prefetch: if c.default and not c.default.is_sequence and c.default.is_scalar: val = c.default.arg else: > val = self.get_insert_default(c) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.primary_key and column is column.table._autoincrement_column: if column.server_default and column.server_default.has_argument: # pre-execute passive defaults on primary key columns return self._execute_scalar( "select %s" % column.server_default.arg, column.type ) elif column.default is None or ( column.default.is_sequence and column.default.optional ): # execute the sequence associated with a SERIAL primary # key column. 
for non-primary-key SERIAL, the ID just # generates server side. try: seq_name = column._postgresql_seq_name except AttributeError: tab = column.table.name col = column.name tab = tab[0 : 29 + max(0, (29 - len(col)))] col = col[0 : 29 + max(0, (29 - len(tab)))] name = "%s_%s_seq" % (tab, col) column._postgresql_seq_name = seq_name = name if column.table is not None: effective_schema = self.connection.schema_for_object( column.table ) else: effective_schema = None if effective_schema is not None: exc = 'select nextval(\'"%s"."%s"\')' % ( effective_schema, seq_name, ) else: exc = "select nextval('\"%s\"')" % (seq_name,) return self._execute_scalar(exc, column.type) > return super(PGExecutionContext, self).get_insert_default(column) /usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.default is None: return None else: > return self._exec_default(column, column.default, column.type) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) default = ColumnDefault() type_ = Unicode() def _exec_default(self, column, default, type_): if default.is_sequence: return self.fire_sequence(default, type_) elif default.is_callable: self.current_column = column > return default.arg(self) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ context = def source_version_default(context): > dist = pkg_resources.get_distribution("datanommer.models") ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ dist = Requirement.parse('datanommer.models') def get_distribution(dist): """Return a current distribution object for a Requirement or string""" if isinstance(dist, str): dist = Requirement.parse(dist) if isinstance(dist, Requirement): > dist = get_provider(dist) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ moduleOrReq = Requirement.parse('datanommer.models') def get_provider(moduleOrReq): """Return an IResourceProvider for the named module or requirement""" if isinstance(moduleOrReq, Requirement): > return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = requirements = ('datanommer.models',) def require(self, *requirements): """Ensure that distributions matching `requirements` are activated `requirements` must be a string or a (possibly-nested) sequence thereof, specifying the distributions and versions required. The return value is a sequence of the distributions that needed to be activated to fulfill the requirements; all relevant distributions are included, even if they were already activated in this working set. """ > needed = self.resolve(parse_requirements(requirements)) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , requirements = [] env = , installer = None replace_conflicting = False, extras = None def resolve(self, requirements, env=None, installer=None, # noqa: C901 replace_conflicting=False, extras=None): """List all distributions needed to (recursively) meet `requirements` `requirements` must be a sequence of ``Requirement`` objects. `env`, if supplied, should be an ``Environment`` instance. 
If not supplied, it defaults to all distributions available within any entry or distribution in the working set. `installer`, if supplied, will be invoked with each requirement that cannot be met by an already-installed distribution; it should return a ``Distribution`` or ``None``. Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. 
required_by = collections.defaultdict(set) while requirements: # process dependencies breadth-first req = requirements.pop(0) if req in processed: # Ignore cyclic or redundant dependencies continue if not req_extras.markers_pass(req, extras): continue dist = best.get(req.key) if dist is None: # Find the best distribution and add it to the map dist = self.by_key.get(req.key) if dist is None or (dist not in req and replace_conflicting): ws = self if env is None: if dist is None: env = Environment(self.entries) else: # Use an empty environment and workingset to avoid # any further conflicts with the conflicting # distribution env = Environment([]) ws = WorkingSet([]) dist = best[req.key] = env.best_match( req, ws, installer, replace_conflicting=replace_conflicting ) if dist is None: requirers = required_by.get(req, None) > raise DistributionNotFound(req, requirers) E pkg_resources.DistributionNotFound: The 'datanommer.models' distribution was not found and is required by the application /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: DistributionNotFound The above exception was the direct cause of the following exception: datanommer_models = def test_grep_defer(datanommer_models): example_message = generate_message() > add(example_message) tests/test_model.py:456: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:147: in add Message.create( ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:275: in create session.flush() :2: in flush ??? 
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3444: in flush self._flush(objects) /usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3583: in _flush with util.safe_reraise(): /usr/lib64/python3.11/site-packages/sqlalchemy/util/langhelpers.py:70: in __exit__ compat.raise_( /usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_ raise exception /usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3544: in _flush flush_context.execute() /usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:456: in execute rec.execute(self) /usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:630: in execute util.preloaded.orm_persistence.save_obj( /usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:245: in save_obj _emit_insert_statements( /usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:1238: in _emit_insert_statements result = connection._execute_20( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1705: in _execute_20 return meth(self, args_10style, kwargs_10style, execution_options) /usr/lib64/python3.11/site-packages/sqlalchemy/sql/elements.py:334: in _execute_on_connection return connection._execute_clauseelement( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1572: in _execute_clauseelement ret = self._execute_context( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1806: in _execute_context self._handle_dbapi_exception( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:2124: in _handle_dbapi_exception util.raise_( /usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_ raise exception /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: in _execute_context context = constructor( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: in _init_compiled self._process_executesingle_defaults() /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: in 
_process_executesingle_defaults val = self.get_insert_default(c) /usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: in get_insert_default return super(PGExecutionContext, self).get_insert_default(column) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: in get_insert_default return self._exec_default(column, column.default, column.type) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: in _exec_default return default.arg(self) ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: in source_version_default dist = pkg_resources.get_distribution("datanommer.models") /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: in get_distribution dist = get_provider(dist) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: in get_provider return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: in require needed = self.resolve(parse_requirements(requirements)) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , requirements = [] env = , installer = None replace_conflicting = False, extras = None def resolve(self, requirements, env=None, installer=None, # noqa: C901 replace_conflicting=False, extras=None): """List all distributions needed to (recursively) meet `requirements` `requirements` must be a sequence of ``Requirement`` objects. `env`, if supplied, should be an ``Environment`` instance. If not supplied, it defaults to all distributions available within any entry or distribution in the working set. `installer`, if supplied, will be invoked with each requirement that cannot be met by an already-installed distribution; it should return a ``Distribution`` or ``None``. 
Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. required_by = collections.defaultdict(set) while requirements: # process dependencies breadth-first req = requirements.pop(0) if req in processed: # Ignore cyclic or redundant dependencies continue if not req_extras.markers_pass(req, extras): continue dist = best.get(req.key) if dist is None: # Find the best distribution and add it to the map dist = self.by_key.get(req.key) if dist is None or (dist not in req and replace_conflicting): ws = self if env is None: if dist is None: env = Environment(self.entries) else: # Use an empty environment and workingset to avoid # any further conflicts with the conflicting # distribution env = Environment([]) ws = WorkingSet([]) dist = best[req.key] = env.best_match( req, ws, installer, replace_conflicting=replace_conflicting ) if dist is None: requirers = required_by.get(req, None) > raise DistributionNotFound(req, requirers) E sqlalchemy.exc.StatementError: (pkg_resources.DistributionNotFound) The 'datanommer.models' distribution was not found and is required by the application E [SQL: INSERT INTO messages (msg_id, i, topic, timestamp, certificate, 
signature, category, username, crypto, source_name, source_version, msg, headers) VALUES (%(msg_id)s, %(i)s, %(topic)s, %(timestamp)s, %(certificate)s, %(signature)s, %(category)s, %(username)s, %(crypto)s, %(source_name)s, %(source_version)s, %(msg)s, %(headers)s) RETURNING messages.id] E [parameters: [{'topic': 'org.fedoraproject.test.a.nice.message', 'timestamp': datetime.datetime(2023, 5, 9, 2, 51, 28, tzinfo=datetime.timezone.utc), 'i': 0, 'head ... (164 characters truncated) ... d-43b6-b184-4fa4798065a8', 'msg': {'encouragement': "You're doing great!"}, 'username': None, 'crypto': None, 'certificate': None, 'signature': None}]] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: StatementError ______________________________ test_add_duplicate ______________________________ self = dialect = constructor = > statement = parameters = [{'category': 'a', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:51:42+00:00'}, ...}] execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 26]}}) args = (, [{'category': 'a', 'certifi...erity': 20, 'sent-at': '2023-05-09T02:51:42+00:00'}, ...}], , []) kw = {'cache_hit': symbol('CACHE_MISS')} branched = yp = None conn = def _execute_context( self, dialect, constructor, statement, parameters, execution_options, *args, **kw ): """Create an :class:`.ExecutionContext` and execute, returning a :class:`_engine.CursorResult`.""" branched = self if self.__branch_from: # if this is a "branched" connection, do everything in terms # of the "root" connection, *except* for .close(), which is # the only feature that branching provides self = self.__branch_from if execution_options: yp = execution_options.get("yield_per", None) if yp: execution_options = execution_options.union( {"stream_results": True, "max_row_buffer": yp} ) try: conn = self._dbapi_connection if conn is None: conn = 
self._revalidate_connection() > context = constructor( dialect, self, conn, execution_options, *args, **kw ) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ cls = dialect = connection = dbapi_connection = execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 26]}}) compiled = parameters = [{'category': 'a', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:51:42+00:00'}, ...}] invoked_statement = extracted_parameters = [], cache_hit = symbol('CACHE_MISS') @classmethod def _init_compiled( cls, dialect, connection, dbapi_connection, execution_options, compiled, parameters, invoked_statement, extracted_parameters, cache_hit=CACHING_DISABLED, ): """Initialize execution context for a Compiled construct.""" self = cls.__new__(cls) self.root_connection = connection self._dbapi_connection = dbapi_connection self.dialect = connection.dialect self.extracted_parameters = extracted_parameters self.invoked_statement = invoked_statement self.compiled = compiled self.cache_hit = cache_hit self.execution_options = execution_options self._is_future_result = ( connection._is_future or self.execution_options.get("future_result", False) ) self.result_column_struct = ( compiled._result_columns, compiled._ordered_columns, compiled._textual_ordered_columns, compiled._ad_hoc_textual, compiled._loose_column_name_matching, ) self.isinsert = compiled.isinsert self.isupdate = compiled.isupdate self.isdelete = compiled.isdelete self.is_text = compiled.isplaintext if self.isinsert or self.isupdate or self.isdelete: self.is_crud = True self._is_explicit_returning = bool(compiled.statement._returning) self._is_implicit_returning = bool( compiled.returning and not compiled.statement._returning ) if not parameters: self.compiled_parameters = [ compiled.construct_params( 
extracted_parameters=extracted_parameters, escape_names=False, ) ] else: self.compiled_parameters = [ compiled.construct_params( m, escape_names=False, _group_number=grp, extracted_parameters=extracted_parameters, ) for grp, m in enumerate(parameters) ] self.executemany = len(parameters) > 1 # this must occur before create_cursor() since the statement # has to be regexed in some cases for server side cursor if util.py2k: self.unicode_statement = util.text_type(compiled.string) else: self.unicode_statement = compiled.string self.cursor = self.create_cursor() if self.compiled.insert_prefetch or self.compiled.update_prefetch: if self.executemany: self._process_executemany_defaults() else: > self._process_executesingle_defaults() /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _process_executesingle_defaults(self): key_getter = self.compiled._within_exec_param_key_getter self.current_parameters = ( compiled_parameters ) = self.compiled_parameters[0] for c in self.compiled.insert_prefetch: if c.default and not c.default.is_sequence and c.default.is_scalar: val = c.default.arg else: > val = self.get_insert_default(c) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.primary_key and column is column.table._autoincrement_column: if column.server_default and column.server_default.has_argument: # pre-execute passive defaults on primary key columns return self._execute_scalar( "select %s" % column.server_default.arg, column.type ) elif column.default is None or ( column.default.is_sequence and column.default.optional ): # execute the sequence associated with a SERIAL primary # key column. 
for non-primary-key SERIAL, the ID just # generates server side. try: seq_name = column._postgresql_seq_name except AttributeError: tab = column.table.name col = column.name tab = tab[0 : 29 + max(0, (29 - len(col)))] col = col[0 : 29 + max(0, (29 - len(tab)))] name = "%s_%s_seq" % (tab, col) column._postgresql_seq_name = seq_name = name if column.table is not None: effective_schema = self.connection.schema_for_object( column.table ) else: effective_schema = None if effective_schema is not None: exc = 'select nextval(\'"%s"."%s"\')' % ( effective_schema, seq_name, ) else: exc = "select nextval('\"%s\"')" % (seq_name,) return self._execute_scalar(exc, column.type) > return super(PGExecutionContext, self).get_insert_default(column) /usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.default is None: return None else: > return self._exec_default(column, column.default, column.type) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) default = ColumnDefault() type_ = Unicode() def _exec_default(self, column, default, type_): if default.is_sequence: return self.fire_sequence(default, type_) elif default.is_callable: self.current_column = column > return default.arg(self) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ context = def source_version_default(context): > dist = pkg_resources.get_distribution("datanommer.models") ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ dist = Requirement.parse('datanommer.models') def get_distribution(dist): """Return a current distribution object for a Requirement or string""" if isinstance(dist, str): dist = Requirement.parse(dist) if isinstance(dist, Requirement): > dist = get_provider(dist) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ moduleOrReq = Requirement.parse('datanommer.models') def get_provider(moduleOrReq): """Return an IResourceProvider for the named module or requirement""" if isinstance(moduleOrReq, Requirement): > return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = requirements = ('datanommer.models',) def require(self, *requirements): """Ensure that distributions matching `requirements` are activated `requirements` must be a string or a (possibly-nested) sequence thereof, specifying the distributions and versions required. The return value is a sequence of the distributions that needed to be activated to fulfill the requirements; all relevant distributions are included, even if they were already activated in this working set. """ > needed = self.resolve(parse_requirements(requirements)) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , requirements = [] env = , installer = None replace_conflicting = False, extras = None def resolve(self, requirements, env=None, installer=None, # noqa: C901 replace_conflicting=False, extras=None): """List all distributions needed to (recursively) meet `requirements` `requirements` must be a sequence of ``Requirement`` objects. `env`, if supplied, should be an ``Environment`` instance. 
If not supplied, it defaults to all distributions available within any entry or distribution in the working set. `installer`, if supplied, will be invoked with each requirement that cannot be met by an already-installed distribution; it should return a ``Distribution`` or ``None``. Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. 
required_by = collections.defaultdict(set) while requirements: # process dependencies breadth-first req = requirements.pop(0) if req in processed: # Ignore cyclic or redundant dependencies continue if not req_extras.markers_pass(req, extras): continue dist = best.get(req.key) if dist is None: # Find the best distribution and add it to the map dist = self.by_key.get(req.key) if dist is None or (dist not in req and replace_conflicting): ws = self if env is None: if dist is None: env = Environment(self.entries) else: # Use an empty environment and workingset to avoid # any further conflicts with the conflicting # distribution env = Environment([]) ws = WorkingSet([]) dist = best[req.key] = env.best_match( req, ws, installer, replace_conflicting=replace_conflicting ) if dist is None: requirers = required_by.get(req, None) > raise DistributionNotFound(req, requirers) E pkg_resources.DistributionNotFound: The 'datanommer.models' distribution was not found and is required by the application /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: DistributionNotFound The above exception was the direct cause of the following exception: datanommer_models = caplog = <_pytest.logging.LogCaptureFixture object at 0xffffff96ca4390> def test_add_duplicate(datanommer_models, caplog): example_message = generate_message() > add(example_message) tests/test_model.py:468: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:147: in add Message.create( ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:275: in create session.flush() :2: in flush ??? 
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3444: in flush self._flush(objects) /usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3583: in _flush with util.safe_reraise(): /usr/lib64/python3.11/site-packages/sqlalchemy/util/langhelpers.py:70: in __exit__ compat.raise_( /usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_ raise exception /usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3544: in _flush flush_context.execute() /usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:456: in execute rec.execute(self) /usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:630: in execute util.preloaded.orm_persistence.save_obj( /usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:245: in save_obj _emit_insert_statements( /usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:1238: in _emit_insert_statements result = connection._execute_20( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1705: in _execute_20 return meth(self, args_10style, kwargs_10style, execution_options) /usr/lib64/python3.11/site-packages/sqlalchemy/sql/elements.py:334: in _execute_on_connection return connection._execute_clauseelement( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1572: in _execute_clauseelement ret = self._execute_context( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1806: in _execute_context self._handle_dbapi_exception( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:2124: in _handle_dbapi_exception util.raise_( /usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_ raise exception /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: in _execute_context context = constructor( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: in _init_compiled self._process_executesingle_defaults() /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: in 
_process_executesingle_defaults val = self.get_insert_default(c) /usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: in get_insert_default return super(PGExecutionContext, self).get_insert_default(column) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: in get_insert_default return self._exec_default(column, column.default, column.type) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: in _exec_default return default.arg(self) ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: in source_version_default dist = pkg_resources.get_distribution("datanommer.models") /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: in get_distribution dist = get_provider(dist) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: in get_provider return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: in require needed = self.resolve(parse_requirements(requirements)) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , requirements = [] env = , installer = None replace_conflicting = False, extras = None def resolve(self, requirements, env=None, installer=None, # noqa: C901 replace_conflicting=False, extras=None): """List all distributions needed to (recursively) meet `requirements` `requirements` must be a sequence of ``Requirement`` objects. `env`, if supplied, should be an ``Environment`` instance. If not supplied, it defaults to all distributions available within any entry or distribution in the working set. `installer`, if supplied, will be invoked with each requirement that cannot be met by an already-installed distribution; it should return a ``Distribution`` or ``None``. 
Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. required_by = collections.defaultdict(set) while requirements: # process dependencies breadth-first req = requirements.pop(0) if req in processed: # Ignore cyclic or redundant dependencies continue if not req_extras.markers_pass(req, extras): continue dist = best.get(req.key) if dist is None: # Find the best distribution and add it to the map dist = self.by_key.get(req.key) if dist is None or (dist not in req and replace_conflicting): ws = self if env is None: if dist is None: env = Environment(self.entries) else: # Use an empty environment and workingset to avoid # any further conflicts with the conflicting # distribution env = Environment([]) ws = WorkingSet([]) dist = best[req.key] = env.best_match( req, ws, installer, replace_conflicting=replace_conflicting ) if dist is None: requirers = required_by.get(req, None) > raise DistributionNotFound(req, requirers) E sqlalchemy.exc.StatementError: (pkg_resources.DistributionNotFound) The 'datanommer.models' distribution was not found and is required by the application E [SQL: INSERT INTO messages (msg_id, i, topic, timestamp, certificate, 
signature, category, username, crypto, source_name, source_version, msg, headers) VALUES (%(msg_id)s, %(i)s, %(topic)s, %(timestamp)s, %(certificate)s, %(signature)s, %(category)s, %(username)s, %(crypto)s, %(source_name)s, %(source_version)s, %(msg)s, %(headers)s) RETURNING messages.id] E [parameters: [{'topic': 'org.fedoraproject.test.a.nice.message', 'timestamp': datetime.datetime(2023, 5, 9, 2, 51, 42, tzinfo=datetime.timezone.utc), 'i': 0, 'head ... (164 characters truncated) ... a-492c-9338-d92137f18cb5', 'msg': {'encouragement': "You're doing great!"}, 'username': None, 'crypto': None, 'certificate': None, 'signature': None}]] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: StatementError __________________________ test_add_duplicate_package __________________________ self = dialect = constructor = > statement = parameters = [{'category': 'a', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_rpm_pkg': True, 'fedora_messaging_schema': 'MessageWithPackages', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:52:00+00:00'}, ...}] execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 27]}}) args = (, [{'category': 'a', 'certifi...erity': 20, 'sent-at': '2023-05-09T02:52:00+00:00'}, ...}], , []) kw = {'cache_hit': symbol('CACHE_MISS')} branched = yp = None conn = def _execute_context( self, dialect, constructor, statement, parameters, execution_options, *args, **kw ): """Create an :class:`.ExecutionContext` and execute, returning a :class:`_engine.CursorResult`.""" branched = self if self.__branch_from: # if this is a "branched" connection, do everything in terms # of the "root" connection, *except* for .close(), which is # the only feature that branching provides self = self.__branch_from if execution_options: yp = execution_options.get("yield_per", None) if yp: execution_options = execution_options.union( {"stream_results": True, "max_row_buffer": yp} ) try: conn = self._dbapi_connection if conn 
is None: conn = self._revalidate_connection() > context = constructor( dialect, self, conn, execution_options, *args, **kw ) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ cls = dialect = connection = dbapi_connection = execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 27]}}) compiled = parameters = [{'category': 'a', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_rpm_pkg': True, 'fedora_messaging_schema': 'MessageWithPackages', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:52:00+00:00'}, ...}] invoked_statement = extracted_parameters = [], cache_hit = symbol('CACHE_MISS') @classmethod def _init_compiled( cls, dialect, connection, dbapi_connection, execution_options, compiled, parameters, invoked_statement, extracted_parameters, cache_hit=CACHING_DISABLED, ): """Initialize execution context for a Compiled construct.""" self = cls.__new__(cls) self.root_connection = connection self._dbapi_connection = dbapi_connection self.dialect = connection.dialect self.extracted_parameters = extracted_parameters self.invoked_statement = invoked_statement self.compiled = compiled self.cache_hit = cache_hit self.execution_options = execution_options self._is_future_result = ( connection._is_future or self.execution_options.get("future_result", False) ) self.result_column_struct = ( compiled._result_columns, compiled._ordered_columns, compiled._textual_ordered_columns, compiled._ad_hoc_textual, compiled._loose_column_name_matching, ) self.isinsert = compiled.isinsert self.isupdate = compiled.isupdate self.isdelete = compiled.isdelete self.is_text = compiled.isplaintext if self.isinsert or self.isupdate or self.isdelete: self.is_crud = True self._is_explicit_returning = bool(compiled.statement._returning) self._is_implicit_returning = bool( compiled.returning and not compiled.statement._returning ) if not parameters: 
self.compiled_parameters = [ compiled.construct_params( extracted_parameters=extracted_parameters, escape_names=False, ) ] else: self.compiled_parameters = [ compiled.construct_params( m, escape_names=False, _group_number=grp, extracted_parameters=extracted_parameters, ) for grp, m in enumerate(parameters) ] self.executemany = len(parameters) > 1 # this must occur before create_cursor() since the statement # has to be regexed in some cases for server side cursor if util.py2k: self.unicode_statement = util.text_type(compiled.string) else: self.unicode_statement = compiled.string self.cursor = self.create_cursor() if self.compiled.insert_prefetch or self.compiled.update_prefetch: if self.executemany: self._process_executemany_defaults() else: > self._process_executesingle_defaults() /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _process_executesingle_defaults(self): key_getter = self.compiled._within_exec_param_key_getter self.current_parameters = ( compiled_parameters ) = self.compiled_parameters[0] for c in self.compiled.insert_prefetch: if c.default and not c.default.is_sequence and c.default.is_scalar: val = c.default.arg else: > val = self.get_insert_default(c) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.primary_key and column is column.table._autoincrement_column: if column.server_default and column.server_default.has_argument: # pre-execute passive defaults on primary key columns return self._execute_scalar( "select %s" % column.server_default.arg, column.type ) elif column.default is None or ( column.default.is_sequence and column.default.optional ): # execute the sequence associated with a SERIAL primary # 
key column. for non-primary-key SERIAL, the ID just # generates server side. try: seq_name = column._postgresql_seq_name except AttributeError: tab = column.table.name col = column.name tab = tab[0 : 29 + max(0, (29 - len(col)))] col = col[0 : 29 + max(0, (29 - len(tab)))] name = "%s_%s_seq" % (tab, col) column._postgresql_seq_name = seq_name = name if column.table is not None: effective_schema = self.connection.schema_for_object( column.table ) else: effective_schema = None if effective_schema is not None: exc = 'select nextval(\'"%s"."%s"\')' % ( effective_schema, seq_name, ) else: exc = "select nextval('\"%s\"')" % (seq_name,) return self._execute_scalar(exc, column.type) > return super(PGExecutionContext, self).get_insert_default(column) /usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.default is None: return None else: > return self._exec_default(column, column.default, column.type) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) default = ColumnDefault() type_ = Unicode() def _exec_default(self, column, default, type_): if default.is_sequence: return self.fire_sequence(default, type_) elif default.is_callable: self.current_column = column > return default.arg(self) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ context = def source_version_default(context): > dist = pkg_resources.get_distribution("datanommer.models") 
../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ dist = Requirement.parse('datanommer.models') def get_distribution(dist): """Return a current distribution object for a Requirement or string""" if isinstance(dist, str): dist = Requirement.parse(dist) if isinstance(dist, Requirement): > dist = get_provider(dist) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ moduleOrReq = Requirement.parse('datanommer.models') def get_provider(moduleOrReq): """Return an IResourceProvider for the named module or requirement""" if isinstance(moduleOrReq, Requirement): > return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = requirements = ('datanommer.models',) def require(self, *requirements): """Ensure that distributions matching `requirements` are activated `requirements` must be a string or a (possibly-nested) sequence thereof, specifying the distributions and versions required. The return value is a sequence of the distributions that needed to be activated to fulfill the requirements; all relevant distributions are included, even if they were already activated in this working set. 
""" > needed = self.resolve(parse_requirements(requirements)) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , requirements = [] env = , installer = None replace_conflicting = False, extras = None def resolve(self, requirements, env=None, installer=None, # noqa: C901 replace_conflicting=False, extras=None): """List all distributions needed to (recursively) meet `requirements` `requirements` must be a sequence of ``Requirement`` objects. `env`, if supplied, should be an ``Environment`` instance. If not supplied, it defaults to all distributions available within any entry or distribution in the working set. `installer`, if supplied, will be invoked with each requirement that cannot be met by an already-installed distribution; it should return a ``Distribution`` or ``None``. Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. 
required_by = collections.defaultdict(set) while requirements: # process dependencies breadth-first req = requirements.pop(0) if req in processed: # Ignore cyclic or redundant dependencies continue if not req_extras.markers_pass(req, extras): continue dist = best.get(req.key) if dist is None: # Find the best distribution and add it to the map dist = self.by_key.get(req.key) if dist is None or (dist not in req and replace_conflicting): ws = self if env is None: if dist is None: env = Environment(self.entries) else: # Use an empty environment and workingset to avoid # any further conflicts with the conflicting # distribution env = Environment([]) ws = WorkingSet([]) dist = best[req.key] = env.best_match( req, ws, installer, replace_conflicting=replace_conflicting ) if dist is None: requirers = required_by.get(req, None) > raise DistributionNotFound(req, requirers) E pkg_resources.DistributionNotFound: The 'datanommer.models' distribution was not found and is required by the application /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: DistributionNotFound The above exception was the direct cause of the following exception: datanommer_models = def test_add_duplicate_package(datanommer_models): # Define a special message schema and register it class MessageWithPackages(fedora_message.Message): @property def packages(self): return ["pkg", "pkg"] fedora_message._schema_name_to_class["MessageWithPackages"] = MessageWithPackages fedora_message._class_to_schema_name[MessageWithPackages] = "MessageWithPackages" example_message = MessageWithPackages( topic="org.fedoraproject.test.a.nice.message", body={"encouragement": "You're doing great!"}, headers=None, ) try: > add(example_message) tests/test_model.py:503: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:147: in add Message.create( 
../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:275: in create session.flush() :2: in flush ??? /usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3444: in flush self._flush(objects) /usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3583: in _flush with util.safe_reraise(): /usr/lib64/python3.11/site-packages/sqlalchemy/util/langhelpers.py:70: in __exit__ compat.raise_( /usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_ raise exception /usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3544: in _flush flush_context.execute() /usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:456: in execute rec.execute(self) /usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:630: in execute util.preloaded.orm_persistence.save_obj( /usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:245: in save_obj _emit_insert_statements( /usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:1238: in _emit_insert_statements result = connection._execute_20( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1705: in _execute_20 return meth(self, args_10style, kwargs_10style, execution_options) /usr/lib64/python3.11/site-packages/sqlalchemy/sql/elements.py:334: in _execute_on_connection return connection._execute_clauseelement( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1572: in _execute_clauseelement ret = self._execute_context( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1806: in _execute_context self._handle_dbapi_exception( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:2124: in _handle_dbapi_exception util.raise_( /usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_ raise exception /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: in _execute_context context = constructor( 
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: in _init_compiled self._process_executesingle_defaults() /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: in _process_executesingle_defaults val = self.get_insert_default(c) /usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: in get_insert_default return super(PGExecutionContext, self).get_insert_default(column) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: in get_insert_default return self._exec_default(column, column.default, column.type) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: in _exec_default return default.arg(self) ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: in source_version_default dist = pkg_resources.get_distribution("datanommer.models") /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: in get_distribution dist = get_provider(dist) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: in get_provider return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: in require needed = self.resolve(parse_requirements(requirements)) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , requirements = [] env = , installer = None replace_conflicting = False, extras = None > raise DistributionNotFound(req, requirers) E sqlalchemy.exc.StatementError: (pkg_resources.DistributionNotFound) The 'datanommer.models' distribution was not found and is required by the application E [SQL: INSERT INTO messages (msg_id, i, topic, timestamp, certificate, signature, category, username, crypto, source_name, source_version, msg, headers) VALUES (%(msg_id)s, %(i)s, %(topic)s, %(timestamp)s, %(certificate)s, %(signature)s, %(category)s, %(username)s, %(crypto)s, %(source_name)s, %(source_version)s, %(msg)s, %(headers)s) RETURNING messages.id] E [parameters: [{'topic': 'org.fedoraproject.test.a.nice.message', 'timestamp': datetime.datetime(2023, 5, 9, 2, 52, tzinfo=datetime.timezone.utc), 'i': 0, 'headers' ... (201 characters truncated) ... 
2-4ca3-8d3f-e30c916fa3de', 'msg': {'encouragement': "You're doing great!"}, 'username': None, 'crypto': None, 'certificate': None, 'signature': None}]] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: StatementError ___________________ test_add_message_with_error_on_packages ____________________ self = dialect = constructor = > statement = parameters = [{'category': 'a', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'CustomMessage', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:52:14+00:00'}, ...}] execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 28]}}) args = (, [{'category': 'a', 'certifi...erity': 20, 'sent-at': '2023-05-09T02:52:14+00:00'}, ...}], , []) kw = {'cache_hit': symbol('CACHE_MISS')} branched = yp = None conn = def _execute_context( self, dialect, constructor, statement, parameters, execution_options, *args, **kw ): """Create an :class:`.ExecutionContext` and execute, returning a :class:`_engine.CursorResult`.""" branched = self if self.__branch_from: # if this is a "branched" connection, do everything in terms # of the "root" connection, *except* for .close(), which is # the only feature that branching provides self = self.__branch_from if execution_options: yp = execution_options.get("yield_per", None) if yp: execution_options = execution_options.union( {"stream_results": True, "max_row_buffer": yp} ) try: conn = self._dbapi_connection if conn is None: conn = self._revalidate_connection() > context = constructor( dialect, self, conn, execution_options, *args, **kw ) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ cls = dialect = connection = dbapi_connection = execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 28]}}) compiled = parameters = [{'category': 'a', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 
'CustomMessage', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:52:14+00:00'}, ...}] invoked_statement = extracted_parameters = [], cache_hit = symbol('CACHE_MISS') @classmethod def _init_compiled( cls, dialect, connection, dbapi_connection, execution_options, compiled, parameters, invoked_statement, extracted_parameters, cache_hit=CACHING_DISABLED, ): """Initialize execution context for a Compiled construct.""" self = cls.__new__(cls) self.root_connection = connection self._dbapi_connection = dbapi_connection self.dialect = connection.dialect self.extracted_parameters = extracted_parameters self.invoked_statement = invoked_statement self.compiled = compiled self.cache_hit = cache_hit self.execution_options = execution_options self._is_future_result = ( connection._is_future or self.execution_options.get("future_result", False) ) self.result_column_struct = ( compiled._result_columns, compiled._ordered_columns, compiled._textual_ordered_columns, compiled._ad_hoc_textual, compiled._loose_column_name_matching, ) self.isinsert = compiled.isinsert self.isupdate = compiled.isupdate self.isdelete = compiled.isdelete self.is_text = compiled.isplaintext if self.isinsert or self.isupdate or self.isdelete: self.is_crud = True self._is_explicit_returning = bool(compiled.statement._returning) self._is_implicit_returning = bool( compiled.returning and not compiled.statement._returning ) if not parameters: self.compiled_parameters = [ compiled.construct_params( extracted_parameters=extracted_parameters, escape_names=False, ) ] else: self.compiled_parameters = [ compiled.construct_params( m, escape_names=False, _group_number=grp, extracted_parameters=extracted_parameters, ) for grp, m in enumerate(parameters) ] self.executemany = len(parameters) > 1 # this must occur before create_cursor() since the statement # has to be regexed in some cases for server side cursor if util.py2k: self.unicode_statement = util.text_type(compiled.string) else: self.unicode_statement = 
compiled.string self.cursor = self.create_cursor() if self.compiled.insert_prefetch or self.compiled.update_prefetch: if self.executemany: self._process_executemany_defaults() else: > self._process_executesingle_defaults() /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _process_executesingle_defaults(self): key_getter = self.compiled._within_exec_param_key_getter self.current_parameters = ( compiled_parameters ) = self.compiled_parameters[0] for c in self.compiled.insert_prefetch: if c.default and not c.default.is_sequence and c.default.is_scalar: val = c.default.arg else: > val = self.get_insert_default(c) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.primary_key and column is column.table._autoincrement_column: if column.server_default and column.server_default.has_argument: # pre-execute passive defaults on primary key columns return self._execute_scalar( "select %s" % column.server_default.arg, column.type ) elif column.default is None or ( column.default.is_sequence and column.default.optional ): # execute the sequence associated with a SERIAL primary # key column. for non-primary-key SERIAL, the ID just # generates server side. 
try: seq_name = column._postgresql_seq_name except AttributeError: tab = column.table.name col = column.name tab = tab[0 : 29 + max(0, (29 - len(col)))] col = col[0 : 29 + max(0, (29 - len(tab)))] name = "%s_%s_seq" % (tab, col) column._postgresql_seq_name = seq_name = name if column.table is not None: effective_schema = self.connection.schema_for_object( column.table ) else: effective_schema = None if effective_schema is not None: exc = 'select nextval(\'"%s"."%s"\')' % ( effective_schema, seq_name, ) else: exc = "select nextval('\"%s\"')" % (seq_name,) return self._execute_scalar(exc, column.type) > return super(PGExecutionContext, self).get_insert_default(column) /usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.default is None: return None else: > return self._exec_default(column, column.default, column.type) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) default = ColumnDefault() type_ = Unicode() def _exec_default(self, column, default, type_): if default.is_sequence: return self.fire_sequence(default, type_) elif default.is_callable: self.current_column = column > return default.arg(self) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ context = def source_version_default(context): > dist = pkg_resources.get_distribution("datanommer.models") ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ _ _ _ dist = Requirement.parse('datanommer.models') /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: in get_distribution dist = get_provider(dist) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: in get_provider return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: in require needed = self.resolve(parse_requirements(requirements)) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , requirements = [] env = , installer = None replace_conflicting = False, extras = None > raise DistributionNotFound(req, requirers) E pkg_resources.DistributionNotFound: The 'datanommer.models' distribution was not found and is required by the application /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: DistributionNotFound The above exception was the direct cause of the following exception: datanommer_models = caplog = <_pytest.logging.LogCaptureFixture object at 0xffffff96c39510> def test_add_message_with_error_on_packages(datanommer_models, caplog): # Define a special message schema and register it class CustomMessage(fedora_message.Message): @property def packages(self): raise KeyError def _filter_headers(self): return {} fedora_message._schema_name_to_class["CustomMessage"] = CustomMessage fedora_message._class_to_schema_name[CustomMessage] = "CustomMessage" example_message = CustomMessage( topic="org.fedoraproject.test.a.nice.message", body={"encouragement": "You're doing great!"}, headers=None, ) try: > add(example_message) tests/test_model.py:530: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:147: in add Message.create( ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:275: in create session.flush() :2: in flush ??? /usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3444: in flush self._flush(objects) /usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3583: in _flush with util.safe_reraise(): /usr/lib64/python3.11/site-packages/sqlalchemy/util/langhelpers.py:70: in __exit__ compat.raise_( /usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_ raise exception /usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3544: in _flush flush_context.execute() /usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:456: in execute rec.execute(self) /usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:630: in execute util.preloaded.orm_persistence.save_obj( /usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:245: in save_obj _emit_insert_statements( /usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:1238: in _emit_insert_statements result = connection._execute_20( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1705: in _execute_20 return meth(self, args_10style, kwargs_10style, execution_options) /usr/lib64/python3.11/site-packages/sqlalchemy/sql/elements.py:334: in _execute_on_connection return connection._execute_clauseelement( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1572: in _execute_clauseelement ret = self._execute_context( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1806: in _execute_context self._handle_dbapi_exception( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:2124: in _handle_dbapi_exception util.raise_( /usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_ raise 
exception /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: in _execute_context context = constructor( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: in _init_compiled self._process_executesingle_defaults() /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: in _process_executesingle_defaults val = self.get_insert_default(c) /usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: in get_insert_default return super(PGExecutionContext, self).get_insert_default(column) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: in get_insert_default return self._exec_default(column, column.default, column.type) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: in _exec_default return default.arg(self) ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: in source_version_default dist = pkg_resources.get_distribution("datanommer.models") /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: in get_distribution dist = get_provider(dist) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: in get_provider return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: in require needed = self.resolve(parse_requirements(requirements)) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , requirements = [] env = , installer = None replace_conflicting = False, extras = None def resolve(self, requirements, env=None, installer=None, # noqa: C901 replace_conflicting=False, extras=None): """List all distributions needed to (recursively) meet `requirements` `requirements` must be a sequence of ``Requirement`` objects. `env`, if supplied, should be an ``Environment`` instance. 
If not supplied, it defaults to all distributions available within any entry or distribution in the working set. `installer`, if supplied, will be invoked with each requirement that cannot be met by an already-installed distribution; it should return a ``Distribution`` or ``None``. Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. 
required_by = collections.defaultdict(set) while requirements: # process dependencies breadth-first req = requirements.pop(0) if req in processed: # Ignore cyclic or redundant dependencies continue if not req_extras.markers_pass(req, extras): continue dist = best.get(req.key) if dist is None: # Find the best distribution and add it to the map dist = self.by_key.get(req.key) if dist is None or (dist not in req and replace_conflicting): ws = self if env is None: if dist is None: env = Environment(self.entries) else: # Use an empty environment and workingset to avoid # any further conflicts with the conflicting # distribution env = Environment([]) ws = WorkingSet([]) dist = best[req.key] = env.best_match( req, ws, installer, replace_conflicting=replace_conflicting ) if dist is None: requirers = required_by.get(req, None) > raise DistributionNotFound(req, requirers) E sqlalchemy.exc.StatementError: (pkg_resources.DistributionNotFound) The 'datanommer.models' distribution was not found and is required by the application E [SQL: INSERT INTO messages (msg_id, i, topic, timestamp, certificate, signature, category, username, crypto, source_name, source_version, msg, headers) VALUES (%(msg_id)s, %(i)s, %(topic)s, %(timestamp)s, %(certificate)s, %(signature)s, %(category)s, %(username)s, %(crypto)s, %(source_name)s, %(source_version)s, %(msg)s, %(headers)s) RETURNING messages.id] E [parameters: [{'topic': 'org.fedoraproject.test.a.nice.message', 'timestamp': datetime.datetime(2023, 5, 9, 2, 52, 14, tzinfo=datetime.timezone.utc), 'i': 0, 'head ... (165 characters truncated) ... 
1-41a4-8408-6442888f5d67', 'msg': {'encouragement': "You're doing great!"}, 'username': None, 'crypto': None, 'certificate': None, 'signature': None}]]
/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: StatementError
------------------------------ Captured log call -------------------------------
ERROR    datanommer:__init__.py:140 Could not get the list of packages from a message on org.fedoraproject.test.a.nice.message with id 1523f796-7791-41a4-8408-6442888f5d67
Traceback (most recent call last):
  File "/builddir/build/BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py", line 138, in add
    packages = message.packages
               ^^^^^^^^^^^^^^^^
  File "/builddir/build/BUILD/datanommer.models-1.0.4/tests/test_model.py", line 517, in packages
    raise KeyError
KeyError
_________________________ test_as_fedora_message_dict __________________________

self = dialect = constructor = > statement =
parameters = [{'category': 'a', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:52:28+00:00'}, ...}]
execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 29]}})
args = (, [{'category': 'a', 'certifi...erity': 20, 'sent-at': '2023-05-09T02:52:28+00:00'}, ...}], , [])
kw = {'cache_hit': symbol('CACHE_MISS')}
branched = yp = None
conn =

    def _execute_context(
        self, dialect, constructor, statement, parameters, execution_options, *args, **kw
    ):
        """Create an :class:`.ExecutionContext` and execute, returning
        a :class:`_engine.CursorResult`."""

        branched = self
        if self.__branch_from:
            # if this is a "branched" connection, do everything in terms
            # of the "root" connection, *except* for .close(), which is
            # the only feature that branching provides
            self = self.__branch_from

        if execution_options:
            yp = execution_options.get("yield_per", None)
            if yp:
                execution_options = execution_options.union(
{"stream_results": True, "max_row_buffer": yp} ) try: conn = self._dbapi_connection if conn is None: conn = self._revalidate_connection() > context = constructor( dialect, self, conn, execution_options, *args, **kw ) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ cls = dialect = connection = dbapi_connection = execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 29]}}) compiled = parameters = [{'category': 'a', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:52:28+00:00'}, ...}] invoked_statement = extracted_parameters = [], cache_hit = symbol('CACHE_MISS') @classmethod def _init_compiled( cls, dialect, connection, dbapi_connection, execution_options, compiled, parameters, invoked_statement, extracted_parameters, cache_hit=CACHING_DISABLED, ): """Initialize execution context for a Compiled construct.""" self = cls.__new__(cls) self.root_connection = connection self._dbapi_connection = dbapi_connection self.dialect = connection.dialect self.extracted_parameters = extracted_parameters self.invoked_statement = invoked_statement self.compiled = compiled self.cache_hit = cache_hit self.execution_options = execution_options self._is_future_result = ( connection._is_future or self.execution_options.get("future_result", False) ) self.result_column_struct = ( compiled._result_columns, compiled._ordered_columns, compiled._textual_ordered_columns, compiled._ad_hoc_textual, compiled._loose_column_name_matching, ) self.isinsert = compiled.isinsert self.isupdate = compiled.isupdate self.isdelete = compiled.isdelete self.is_text = compiled.isplaintext if self.isinsert or self.isupdate or self.isdelete: self.is_crud = True self._is_explicit_returning = bool(compiled.statement._returning) self._is_implicit_returning = bool( compiled.returning and not 
compiled.statement._returning ) if not parameters: self.compiled_parameters = [ compiled.construct_params( extracted_parameters=extracted_parameters, escape_names=False, ) ] else: self.compiled_parameters = [ compiled.construct_params( m, escape_names=False, _group_number=grp, extracted_parameters=extracted_parameters, ) for grp, m in enumerate(parameters) ] self.executemany = len(parameters) > 1 # this must occur before create_cursor() since the statement # has to be regexed in some cases for server side cursor if util.py2k: self.unicode_statement = util.text_type(compiled.string) else: self.unicode_statement = compiled.string self.cursor = self.create_cursor() if self.compiled.insert_prefetch or self.compiled.update_prefetch: if self.executemany: self._process_executemany_defaults() else: > self._process_executesingle_defaults() /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _process_executesingle_defaults(self): key_getter = self.compiled._within_exec_param_key_getter self.current_parameters = ( compiled_parameters ) = self.compiled_parameters[0] for c in self.compiled.insert_prefetch: if c.default and not c.default.is_sequence and c.default.is_scalar: val = c.default.arg else: > val = self.get_insert_default(c) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.primary_key and column is column.table._autoincrement_column: if column.server_default and column.server_default.has_argument: # pre-execute passive defaults on primary key columns return self._execute_scalar( "select %s" % column.server_default.arg, column.type ) elif column.default is None or ( column.default.is_sequence and column.default.optional ): # execute 
the sequence associated with a SERIAL primary # key column. for non-primary-key SERIAL, the ID just # generates server side. try: seq_name = column._postgresql_seq_name except AttributeError: tab = column.table.name col = column.name tab = tab[0 : 29 + max(0, (29 - len(col)))] col = col[0 : 29 + max(0, (29 - len(tab)))] name = "%s_%s_seq" % (tab, col) column._postgresql_seq_name = seq_name = name if column.table is not None: effective_schema = self.connection.schema_for_object( column.table ) else: effective_schema = None if effective_schema is not None: exc = 'select nextval(\'"%s"."%s"\')' % ( effective_schema, seq_name, ) else: exc = "select nextval('\"%s\"')" % (seq_name,) return self._execute_scalar(exc, column.type) > return super(PGExecutionContext, self).get_insert_default(column) /usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.default is None: return None else: > return self._exec_default(column, column.default, column.type) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) default = ColumnDefault() type_ = Unicode() def _exec_default(self, column, default, type_): if default.is_sequence: return self.fire_sequence(default, type_) elif default.is_callable: self.current_column = column > return default.arg(self) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ context = def source_version_default(context): > dist = pkg_resources.get_distribution("datanommer.models") 
../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ dist = Requirement.parse('datanommer.models') def get_distribution(dist): """Return a current distribution object for a Requirement or string""" if isinstance(dist, str): dist = Requirement.parse(dist) if isinstance(dist, Requirement): > dist = get_provider(dist) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ moduleOrReq = Requirement.parse('datanommer.models') def get_provider(moduleOrReq): """Return an IResourceProvider for the named module or requirement""" if isinstance(moduleOrReq, Requirement): > return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = requirements = ('datanommer.models',) def require(self, *requirements): """Ensure that distributions matching `requirements` are activated `requirements` must be a string or a (possibly-nested) sequence thereof, specifying the distributions and versions required. The return value is a sequence of the distributions that needed to be activated to fulfill the requirements; all relevant distributions are included, even if they were already activated in this working set. 
""" > needed = self.resolve(parse_requirements(requirements)) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , requirements = [] env = , installer = None replace_conflicting = False, extras = None def resolve(self, requirements, env=None, installer=None, # noqa: C901 replace_conflicting=False, extras=None): """List all distributions needed to (recursively) meet `requirements` `requirements` must be a sequence of ``Requirement`` objects. `env`, if supplied, should be an ``Environment`` instance. If not supplied, it defaults to all distributions available within any entry or distribution in the working set. `installer`, if supplied, will be invoked with each requirement that cannot be met by an already-installed distribution; it should return a ``Distribution`` or ``None``. Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. 
required_by = collections.defaultdict(set) while requirements: # process dependencies breadth-first req = requirements.pop(0) if req in processed: # Ignore cyclic or redundant dependencies continue if not req_extras.markers_pass(req, extras): continue dist = best.get(req.key) if dist is None: # Find the best distribution and add it to the map dist = self.by_key.get(req.key) if dist is None or (dist not in req and replace_conflicting): ws = self if env is None: if dist is None: env = Environment(self.entries) else: # Use an empty environment and workingset to avoid # any further conflicts with the conflicting # distribution env = Environment([]) ws = WorkingSet([]) dist = best[req.key] = env.best_match( req, ws, installer, replace_conflicting=replace_conflicting ) if dist is None: requirers = required_by.get(req, None) > raise DistributionNotFound(req, requirers) E pkg_resources.DistributionNotFound: The 'datanommer.models' distribution was not found and is required by the application /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: DistributionNotFound The above exception was the direct cause of the following exception: datanommer_models = def test_as_fedora_message_dict(datanommer_models): example_message = generate_message() > add(example_message) tests/test_model.py:542: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:147: in add Message.create( ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:275: in create session.flush() :2: in flush ??? 
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3444: in flush self._flush(objects) /usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3583: in _flush with util.safe_reraise(): /usr/lib64/python3.11/site-packages/sqlalchemy/util/langhelpers.py:70: in __exit__ compat.raise_( /usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_ raise exception /usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3544: in _flush flush_context.execute() /usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:456: in execute rec.execute(self) /usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:630: in execute util.preloaded.orm_persistence.save_obj( /usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:245: in save_obj _emit_insert_statements( /usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:1238: in _emit_insert_statements result = connection._execute_20( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1705: in _execute_20 return meth(self, args_10style, kwargs_10style, execution_options) /usr/lib64/python3.11/site-packages/sqlalchemy/sql/elements.py:334: in _execute_on_connection return connection._execute_clauseelement( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1572: in _execute_clauseelement ret = self._execute_context( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1806: in _execute_context self._handle_dbapi_exception( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:2124: in _handle_dbapi_exception util.raise_( /usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_ raise exception /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: in _execute_context context = constructor( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: in _init_compiled self._process_executesingle_defaults() /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: in 
_process_executesingle_defaults val = self.get_insert_default(c) /usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: in get_insert_default return super(PGExecutionContext, self).get_insert_default(column) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: in get_insert_default return self._exec_default(column, column.default, column.type) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: in _exec_default return default.arg(self) ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: in source_version_default dist = pkg_resources.get_distribution("datanommer.models") /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: in get_distribution dist = get_provider(dist) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: in get_provider return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: in require needed = self.resolve(parse_requirements(requirements)) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , requirements = [] env = , installer = None replace_conflicting = False, extras = None def resolve(self, requirements, env=None, installer=None, # noqa: C901 replace_conflicting=False, extras=None): """List all distributions needed to (recursively) meet `requirements` `requirements` must be a sequence of ``Requirement`` objects. `env`, if supplied, should be an ``Environment`` instance. If not supplied, it defaults to all distributions available within any entry or distribution in the working set. `installer`, if supplied, will be invoked with each requirement that cannot be met by an already-installed distribution; it should return a ``Distribution`` or ``None``. 
Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. required_by = collections.defaultdict(set) while requirements: # process dependencies breadth-first req = requirements.pop(0) if req in processed: # Ignore cyclic or redundant dependencies continue if not req_extras.markers_pass(req, extras): continue dist = best.get(req.key) if dist is None: # Find the best distribution and add it to the map dist = self.by_key.get(req.key) if dist is None or (dist not in req and replace_conflicting): ws = self if env is None: if dist is None: env = Environment(self.entries) else: # Use an empty environment and workingset to avoid # any further conflicts with the conflicting # distribution env = Environment([]) ws = WorkingSet([]) dist = best[req.key] = env.best_match( req, ws, installer, replace_conflicting=replace_conflicting ) if dist is None: requirers = required_by.get(req, None) > raise DistributionNotFound(req, requirers) E sqlalchemy.exc.StatementError: (pkg_resources.DistributionNotFound) The 'datanommer.models' distribution was not found and is required by the application E [SQL: INSERT INTO messages (msg_id, i, topic, timestamp, certificate, 
signature, category, username, crypto, source_name, source_version, msg, headers) VALUES (%(msg_id)s, %(i)s, %(topic)s, %(timestamp)s, %(certificate)s, %(signature)s, %(category)s, %(username)s, %(crypto)s, %(source_name)s, %(source_version)s, %(msg)s, %(headers)s) RETURNING messages.id] E [parameters: [{'topic': 'org.fedoraproject.test.a.nice.message', 'timestamp': datetime.datetime(2023, 5, 9, 2, 52, 28, tzinfo=datetime.timezone.utc), 'i': 0, 'head ... (164 characters truncated) ... a-4ae1-92b2-408b3ef6d6e7', 'msg': {'encouragement': "You're doing great!"}, 'username': None, 'crypto': None, 'certificate': None, 'signature': None}]] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: StatementError ___________________ test_as_fedora_message_dict_old_headers ____________________ self = dialect = constructor = > statement = parameters = [{'category': 'a', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:52:42+00:00'}, ...}] execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 30]}}) args = (, [{'category': 'a', 'certifi...erity': 20, 'sent-at': '2023-05-09T02:52:42+00:00'}, ...}], , []) kw = {'cache_hit': symbol('CACHE_MISS')} branched = yp = None conn = def _execute_context( self, dialect, constructor, statement, parameters, execution_options, *args, **kw ): """Create an :class:`.ExecutionContext` and execute, returning a :class:`_engine.CursorResult`.""" branched = self if self.__branch_from: # if this is a "branched" connection, do everything in terms # of the "root" connection, *except* for .close(), which is # the only feature that branching provides self = self.__branch_from if execution_options: yp = execution_options.get("yield_per", None) if yp: execution_options = execution_options.union( {"stream_results": True, "max_row_buffer": yp} ) try: conn = self._dbapi_connection if conn is None: conn = 
self._revalidate_connection() > context = constructor( dialect, self, conn, execution_options, *args, **kw ) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ cls = dialect = connection = dbapi_connection = execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 30]}}) compiled = parameters = [{'category': 'a', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:52:42+00:00'}, ...}] invoked_statement = extracted_parameters = [], cache_hit = symbol('CACHE_MISS') @classmethod def _init_compiled( cls, dialect, connection, dbapi_connection, execution_options, compiled, parameters, invoked_statement, extracted_parameters, cache_hit=CACHING_DISABLED, ): """Initialize execution context for a Compiled construct.""" self = cls.__new__(cls) self.root_connection = connection self._dbapi_connection = dbapi_connection self.dialect = connection.dialect self.extracted_parameters = extracted_parameters self.invoked_statement = invoked_statement self.compiled = compiled self.cache_hit = cache_hit self.execution_options = execution_options self._is_future_result = ( connection._is_future or self.execution_options.get("future_result", False) ) self.result_column_struct = ( compiled._result_columns, compiled._ordered_columns, compiled._textual_ordered_columns, compiled._ad_hoc_textual, compiled._loose_column_name_matching, ) self.isinsert = compiled.isinsert self.isupdate = compiled.isupdate self.isdelete = compiled.isdelete self.is_text = compiled.isplaintext if self.isinsert or self.isupdate or self.isdelete: self.is_crud = True self._is_explicit_returning = bool(compiled.statement._returning) self._is_implicit_returning = bool( compiled.returning and not compiled.statement._returning ) if not parameters: self.compiled_parameters = [ compiled.construct_params( 
extracted_parameters=extracted_parameters, escape_names=False, ) ] else: self.compiled_parameters = [ compiled.construct_params( m, escape_names=False, _group_number=grp, extracted_parameters=extracted_parameters, ) for grp, m in enumerate(parameters) ] self.executemany = len(parameters) > 1 # this must occur before create_cursor() since the statement # has to be regexed in some cases for server side cursor if util.py2k: self.unicode_statement = util.text_type(compiled.string) else: self.unicode_statement = compiled.string self.cursor = self.create_cursor() if self.compiled.insert_prefetch or self.compiled.update_prefetch: if self.executemany: self._process_executemany_defaults() else: > self._process_executesingle_defaults() /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _process_executesingle_defaults(self): key_getter = self.compiled._within_exec_param_key_getter self.current_parameters = ( compiled_parameters ) = self.compiled_parameters[0] for c in self.compiled.insert_prefetch: if c.default and not c.default.is_sequence and c.default.is_scalar: val = c.default.arg else: > val = self.get_insert_default(c) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.primary_key and column is column.table._autoincrement_column: if column.server_default and column.server_default.has_argument: # pre-execute passive defaults on primary key columns return self._execute_scalar( "select %s" % column.server_default.arg, column.type ) elif column.default is None or ( column.default.is_sequence and column.default.optional ): # execute the sequence associated with a SERIAL primary # key column. 
for non-primary-key SERIAL, the ID just # generates server side. try: seq_name = column._postgresql_seq_name except AttributeError: tab = column.table.name col = column.name tab = tab[0 : 29 + max(0, (29 - len(col)))] col = col[0 : 29 + max(0, (29 - len(tab)))] name = "%s_%s_seq" % (tab, col) column._postgresql_seq_name = seq_name = name if column.table is not None: effective_schema = self.connection.schema_for_object( column.table ) else: effective_schema = None if effective_schema is not None: exc = 'select nextval(\'"%s"."%s"\')' % ( effective_schema, seq_name, ) else: exc = "select nextval('\"%s\"')" % (seq_name,) return self._execute_scalar(exc, column.type) > return super(PGExecutionContext, self).get_insert_default(column) /usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.default is None: return None else: > return self._exec_default(column, column.default, column.type) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) default = ColumnDefault() type_ = Unicode() def _exec_default(self, column, default, type_): if default.is_sequence: return self.fire_sequence(default, type_) elif default.is_callable: self.current_column = column > return default.arg(self) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ context = def source_version_default(context): > dist = pkg_resources.get_distribution("datanommer.models") ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ dist = Requirement.parse('datanommer.models') def get_distribution(dist): """Return a current distribution object for a Requirement or string""" if isinstance(dist, str): dist = Requirement.parse(dist) if isinstance(dist, Requirement): > dist = get_provider(dist) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ moduleOrReq = Requirement.parse('datanommer.models') def get_provider(moduleOrReq): """Return an IResourceProvider for the named module or requirement""" if isinstance(moduleOrReq, Requirement): > return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = requirements = ('datanommer.models',) def require(self, *requirements): """Ensure that distributions matching `requirements` are activated `requirements` must be a string or a (possibly-nested) sequence thereof, specifying the distributions and versions required. The return value is a sequence of the distributions that needed to be activated to fulfill the requirements; all relevant distributions are included, even if they were already activated in this working set. """ > needed = self.resolve(parse_requirements(requirements)) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , requirements = [] env = , installer = None replace_conflicting = False, extras = None def resolve(self, requirements, env=None, installer=None, # noqa: C901 replace_conflicting=False, extras=None): """List all distributions needed to (recursively) meet `requirements` `requirements` must be a sequence of ``Requirement`` objects. `env`, if supplied, should be an ``Environment`` instance. 
If not supplied, it defaults to all distributions available within any entry or distribution in the working set. `installer`, if supplied, will be invoked with each requirement that cannot be met by an already-installed distribution; it should return a ``Distribution`` or ``None``. Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. 
        required_by = collections.defaultdict(set)

        while requirements:
            # process dependencies breadth-first
            req = requirements.pop(0)

            if req in processed:
                # Ignore cyclic or redundant dependencies
                continue

            if not req_extras.markers_pass(req, extras):
                continue

            dist = best.get(req.key)
            if dist is None:
                # Find the best distribution and add it to the map
                dist = self.by_key.get(req.key)
                if dist is None or (dist not in req and replace_conflicting):
                    ws = self
                    if env is None:
                        if dist is None:
                            env = Environment(self.entries)
                        else:
                            # Use an empty environment and workingset to avoid
                            # any further conflicts with the conflicting
                            # distribution
                            env = Environment([])
                            ws = WorkingSet([])
                    dist = best[req.key] = env.best_match(
                        req, ws, installer,
                        replace_conflicting=replace_conflicting
                    )
                    if dist is None:
                        requirers = required_by.get(req, None)
>                       raise DistributionNotFound(req, requirers)
E                       pkg_resources.DistributionNotFound: The 'datanommer.models' distribution was not found and is required by the application

/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: DistributionNotFound

The above exception was the direct cause of the following exception:

datanommer_models = 

    def test_as_fedora_message_dict_old_headers(datanommer_models):
        # Messages received with fedmsg don't have the sent-at header
        example_message = generate_message()
>       add(example_message)

tests/test_model.py:555:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:147: in add
    Message.create(
../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:275: in create
    session.flush()
:2: in flush
    ???
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3444: in flush
    self._flush(objects)
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3583: in _flush
    with util.safe_reraise():
/usr/lib64/python3.11/site-packages/sqlalchemy/util/langhelpers.py:70: in __exit__
    compat.raise_(
/usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_
    raise exception
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3544: in _flush
    flush_context.execute()
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:456: in execute
    rec.execute(self)
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:630: in execute
    util.preloaded.orm_persistence.save_obj(
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:245: in save_obj
    _emit_insert_statements(
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:1238: in _emit_insert_statements
    result = connection._execute_20(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1705: in _execute_20
    return meth(self, args_10style, kwargs_10style, execution_options)
/usr/lib64/python3.11/site-packages/sqlalchemy/sql/elements.py:334: in _execute_on_connection
    return connection._execute_clauseelement(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1572: in _execute_clauseelement
    ret = self._execute_context(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1806: in _execute_context
    self._handle_dbapi_exception(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:2124: in _handle_dbapi_exception
    util.raise_(
/usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_
    raise exception
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: in _execute_context
    context = constructor(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: in _init_compiled
    self._process_executesingle_defaults()
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: in _process_executesingle_defaults
    val = self.get_insert_default(c)
/usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: in get_insert_default
    return super(PGExecutionContext, self).get_insert_default(column)
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: in get_insert_default
    return self._exec_default(column, column.default, column.type)
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: in _exec_default
    return default.arg(self)
../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: in source_version_default
    dist = pkg_resources.get_distribution("datanommer.models")
/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: in get_distribution
    dist = get_provider(dist)
/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: in get_provider
    return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0]
/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: in require
    needed = self.resolve(parse_requirements(requirements))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = , requirements = []
env = , installer = None
replace_conflicting = False, extras = None

    def resolve(self, requirements, env=None, installer=None,  # noqa: C901
                replace_conflicting=False, extras=None):
        """List all distributions needed to (recursively) meet `requirements`

        `requirements` must be a sequence of ``Requirement`` objects.
        `env`, if supplied, should be an ``Environment`` instance.
Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. required_by = collections.defaultdict(set) while requirements: # process dependencies breadth-first req = requirements.pop(0) if req in processed: # Ignore cyclic or redundant dependencies continue if not req_extras.markers_pass(req, extras): continue dist = best.get(req.key) if dist is None: # Find the best distribution and add it to the map dist = self.by_key.get(req.key) if dist is None or (dist not in req and replace_conflicting): ws = self if env is None: if dist is None: env = Environment(self.entries) else: # Use an empty environment and workingset to avoid # any further conflicts with the conflicting # distribution env = Environment([]) ws = WorkingSet([]) dist = best[req.key] = env.best_match( req, ws, installer, replace_conflicting=replace_conflicting ) if dist is None: requirers = required_by.get(req, None) > raise DistributionNotFound(req, requirers) E sqlalchemy.exc.StatementError: (pkg_resources.DistributionNotFound) The 'datanommer.models' distribution was not found and is required by the application E [SQL: INSERT INTO messages (msg_id, i, topic, timestamp, certificate, 
signature, category, username, crypto, source_name, source_version, msg, headers) VALUES (%(msg_id)s, %(i)s, %(topic)s, %(timestamp)s, %(certificate)s, %(signature)s, %(category)s, %(username)s, %(crypto)s, %(source_name)s, %(source_version)s, %(msg)s, %(headers)s) RETURNING messages.id] E [parameters: [{'topic': 'org.fedoraproject.test.a.nice.message', 'timestamp': datetime.datetime(2023, 5, 9, 2, 52, 42, tzinfo=datetime.timezone.utc), 'i': 0, 'head ... (164 characters truncated) ... e-4ba8-a4e5-372c0ea22d5d', 'msg': {'encouragement': "You're doing great!"}, 'username': None, 'crypto': None, 'certificate': None, 'signature': None}]] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: StatementError ____________________ test_as_fedora_message_dict_no_headers ____________________ self = dialect = constructor = > statement = parameters = [{'category': 'a', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:52:56+00:00'}, ...}] execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 31]}}) args = (, [{'category': 'a', 'certifi...erity': 20, 'sent-at': '2023-05-09T02:52:56+00:00'}, ...}], , []) kw = {'cache_hit': symbol('CACHE_MISS')} branched = yp = None conn = def _execute_context( self, dialect, constructor, statement, parameters, execution_options, *args, **kw ): """Create an :class:`.ExecutionContext` and execute, returning a :class:`_engine.CursorResult`.""" branched = self if self.__branch_from: # if this is a "branched" connection, do everything in terms # of the "root" connection, *except* for .close(), which is # the only feature that branching provides self = self.__branch_from if execution_options: yp = execution_options.get("yield_per", None) if yp: execution_options = execution_options.union( {"stream_results": True, "max_row_buffer": yp} ) try: conn = self._dbapi_connection if conn is None: conn = 
self._revalidate_connection() > context = constructor( dialect, self, conn, execution_options, *args, **kw ) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ cls = dialect = connection = dbapi_connection = execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 31]}}) compiled = parameters = [{'category': 'a', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:52:56+00:00'}, ...}] invoked_statement = extracted_parameters = [], cache_hit = symbol('CACHE_MISS') @classmethod def _init_compiled( cls, dialect, connection, dbapi_connection, execution_options, compiled, parameters, invoked_statement, extracted_parameters, cache_hit=CACHING_DISABLED, ): """Initialize execution context for a Compiled construct.""" self = cls.__new__(cls) self.root_connection = connection self._dbapi_connection = dbapi_connection self.dialect = connection.dialect self.extracted_parameters = extracted_parameters self.invoked_statement = invoked_statement self.compiled = compiled self.cache_hit = cache_hit self.execution_options = execution_options self._is_future_result = ( connection._is_future or self.execution_options.get("future_result", False) ) self.result_column_struct = ( compiled._result_columns, compiled._ordered_columns, compiled._textual_ordered_columns, compiled._ad_hoc_textual, compiled._loose_column_name_matching, ) self.isinsert = compiled.isinsert self.isupdate = compiled.isupdate self.isdelete = compiled.isdelete self.is_text = compiled.isplaintext if self.isinsert or self.isupdate or self.isdelete: self.is_crud = True self._is_explicit_returning = bool(compiled.statement._returning) self._is_implicit_returning = bool( compiled.returning and not compiled.statement._returning ) if not parameters: self.compiled_parameters = [ compiled.construct_params( 
extracted_parameters=extracted_parameters, escape_names=False, ) ] else: self.compiled_parameters = [ compiled.construct_params( m, escape_names=False, _group_number=grp, extracted_parameters=extracted_parameters, ) for grp, m in enumerate(parameters) ] self.executemany = len(parameters) > 1 # this must occur before create_cursor() since the statement # has to be regexed in some cases for server side cursor if util.py2k: self.unicode_statement = util.text_type(compiled.string) else: self.unicode_statement = compiled.string self.cursor = self.create_cursor() if self.compiled.insert_prefetch or self.compiled.update_prefetch: if self.executemany: self._process_executemany_defaults() else: > self._process_executesingle_defaults() /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _process_executesingle_defaults(self): key_getter = self.compiled._within_exec_param_key_getter self.current_parameters = ( compiled_parameters ) = self.compiled_parameters[0] for c in self.compiled.insert_prefetch: if c.default and not c.default.is_sequence and c.default.is_scalar: val = c.default.arg else: > val = self.get_insert_default(c) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.primary_key and column is column.table._autoincrement_column: if column.server_default and column.server_default.has_argument: # pre-execute passive defaults on primary key columns return self._execute_scalar( "select %s" % column.server_default.arg, column.type ) elif column.default is None or ( column.default.is_sequence and column.default.optional ): # execute the sequence associated with a SERIAL primary # key column. 
for non-primary-key SERIAL, the ID just # generates server side. try: seq_name = column._postgresql_seq_name except AttributeError: tab = column.table.name col = column.name tab = tab[0 : 29 + max(0, (29 - len(col)))] col = col[0 : 29 + max(0, (29 - len(tab)))] name = "%s_%s_seq" % (tab, col) column._postgresql_seq_name = seq_name = name if column.table is not None: effective_schema = self.connection.schema_for_object( column.table ) else: effective_schema = None if effective_schema is not None: exc = 'select nextval(\'"%s"."%s"\')' % ( effective_schema, seq_name, ) else: exc = "select nextval('\"%s\"')" % (seq_name,) return self._execute_scalar(exc, column.type) > return super(PGExecutionContext, self).get_insert_default(column) /usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.default is None: return None else: > return self._exec_default(column, column.default, column.type) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) default = ColumnDefault() type_ = Unicode() def _exec_default(self, column, default, type_): if default.is_sequence: return self.fire_sequence(default, type_) elif default.is_callable: self.current_column = column > return default.arg(self) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ context = def source_version_default(context): > dist = pkg_resources.get_distribution("datanommer.models") ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ dist = Requirement.parse('datanommer.models') def get_distribution(dist): """Return a current distribution object for a Requirement or string""" if isinstance(dist, str): dist = Requirement.parse(dist) if isinstance(dist, Requirement): > dist = get_provider(dist) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ moduleOrReq = Requirement.parse('datanommer.models') def get_provider(moduleOrReq): """Return an IResourceProvider for the named module or requirement""" if isinstance(moduleOrReq, Requirement): > return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = requirements = ('datanommer.models',) def require(self, *requirements): """Ensure that distributions matching `requirements` are activated `requirements` must be a string or a (possibly-nested) sequence thereof, specifying the distributions and versions required. The return value is a sequence of the distributions that needed to be activated to fulfill the requirements; all relevant distributions are included, even if they were already activated in this working set. """ > needed = self.resolve(parse_requirements(requirements)) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , requirements = [] env = , installer = None replace_conflicting = False, extras = None def resolve(self, requirements, env=None, installer=None, # noqa: C901 replace_conflicting=False, extras=None): """List all distributions needed to (recursively) meet `requirements` `requirements` must be a sequence of ``Requirement`` objects. `env`, if supplied, should be an ``Environment`` instance. 
If not supplied, it defaults to all distributions available within any entry or distribution in the working set. `installer`, if supplied, will be invoked with each requirement that cannot be met by an already-installed distribution; it should return a ``Distribution`` or ``None``. Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. 
        required_by = collections.defaultdict(set)

        while requirements:
            # process dependencies breadth-first
            req = requirements.pop(0)

            if req in processed:
                # Ignore cyclic or redundant dependencies
                continue

            if not req_extras.markers_pass(req, extras):
                continue

            dist = best.get(req.key)
            if dist is None:
                # Find the best distribution and add it to the map
                dist = self.by_key.get(req.key)
                if dist is None or (dist not in req and replace_conflicting):
                    ws = self
                    if env is None:
                        if dist is None:
                            env = Environment(self.entries)
                        else:
                            # Use an empty environment and workingset to avoid
                            # any further conflicts with the conflicting
                            # distribution
                            env = Environment([])
                            ws = WorkingSet([])
                    dist = best[req.key] = env.best_match(
                        req, ws, installer,
                        replace_conflicting=replace_conflicting
                    )
                    if dist is None:
                        requirers = required_by.get(req, None)
>                       raise DistributionNotFound(req, requirers)
E                       pkg_resources.DistributionNotFound: The 'datanommer.models' distribution was not found and is required by the application

/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: DistributionNotFound

The above exception was the direct cause of the following exception:

datanommer_models = 

    def test_as_fedora_message_dict_no_headers(datanommer_models):
        # Messages can have no headers
        example_message = generate_message()
>       add(example_message)

tests/test_model.py:571:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:147: in add
    Message.create(
../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:275: in create
    session.flush()
:2: in flush
    ???
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3444: in flush
    self._flush(objects)
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3583: in _flush
    with util.safe_reraise():
/usr/lib64/python3.11/site-packages/sqlalchemy/util/langhelpers.py:70: in __exit__
    compat.raise_(
/usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_
    raise exception
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3544: in _flush
    flush_context.execute()
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:456: in execute
    rec.execute(self)
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:630: in execute
    util.preloaded.orm_persistence.save_obj(
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:245: in save_obj
    _emit_insert_statements(
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:1238: in _emit_insert_statements
    result = connection._execute_20(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1705: in _execute_20
    return meth(self, args_10style, kwargs_10style, execution_options)
/usr/lib64/python3.11/site-packages/sqlalchemy/sql/elements.py:334: in _execute_on_connection
    return connection._execute_clauseelement(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1572: in _execute_clauseelement
    ret = self._execute_context(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1806: in _execute_context
    self._handle_dbapi_exception(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:2124: in _handle_dbapi_exception
    util.raise_(
/usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_
    raise exception
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: in _execute_context
    context = constructor(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: in _init_compiled
    self._process_executesingle_defaults()
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: in _process_executesingle_defaults
    val = self.get_insert_default(c)
/usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: in get_insert_default
    return super(PGExecutionContext, self).get_insert_default(column)
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: in get_insert_default
    return self._exec_default(column, column.default, column.type)
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: in _exec_default
    return default.arg(self)
../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: in source_version_default
    dist = pkg_resources.get_distribution("datanommer.models")
/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: in get_distribution
    dist = get_provider(dist)
/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: in get_provider
    return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0]
/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: in require
    needed = self.resolve(parse_requirements(requirements))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = , requirements = []
env = , installer = None
replace_conflicting = False, extras = None

    def resolve(self, requirements, env=None, installer=None,  # noqa: C901
                replace_conflicting=False, extras=None):
        """List all distributions needed to (recursively) meet `requirements`

        `requirements` must be a sequence of ``Requirement`` objects.
        `env`, if supplied, should be an ``Environment`` instance.
        If not supplied, it defaults to all distributions available
        within any entry or distribution in the working set.
        `installer`, if supplied, will be invoked with each requirement
        that cannot be met by an already-installed distribution; it should
        return a ``Distribution`` or ``None``.
Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. required_by = collections.defaultdict(set) while requirements: # process dependencies breadth-first req = requirements.pop(0) if req in processed: # Ignore cyclic or redundant dependencies continue if not req_extras.markers_pass(req, extras): continue dist = best.get(req.key) if dist is None: # Find the best distribution and add it to the map dist = self.by_key.get(req.key) if dist is None or (dist not in req and replace_conflicting): ws = self if env is None: if dist is None: env = Environment(self.entries) else: # Use an empty environment and workingset to avoid # any further conflicts with the conflicting # distribution env = Environment([]) ws = WorkingSet([]) dist = best[req.key] = env.best_match( req, ws, installer, replace_conflicting=replace_conflicting ) if dist is None: requirers = required_by.get(req, None) > raise DistributionNotFound(req, requirers) E sqlalchemy.exc.StatementError: (pkg_resources.DistributionNotFound) The 'datanommer.models' distribution was not found and is required by the application E [SQL: INSERT INTO messages (msg_id, i, topic, timestamp, certificate, 
signature, category, username, crypto, source_name, source_version, msg, headers) VALUES (%(msg_id)s, %(i)s, %(topic)s, %(timestamp)s, %(certificate)s, %(signature)s, %(category)s, %(username)s, %(crypto)s, %(source_name)s, %(source_version)s, %(msg)s, %(headers)s) RETURNING messages.id] E [parameters: [{'topic': 'org.fedoraproject.test.a.nice.message', 'timestamp': datetime.datetime(2023, 5, 9, 2, 52, 56, tzinfo=datetime.timezone.utc), 'i': 0, 'head ... (164 characters truncated) ... 7-4aeb-9785-9be1d962a9be', 'msg': {'encouragement': "You're doing great!"}, 'username': None, 'crypto': None, 'certificate': None, 'signature': None}]] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: StatementError _________________________________ test_as_dict _________________________________ self = dialect = constructor = > statement = parameters = [{'category': 'a', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:53:10+00:00'}, ...}] execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 32]}}) args = (, [{'category': 'a', 'certifi...erity': 20, 'sent-at': '2023-05-09T02:53:10+00:00'}, ...}], , []) kw = {'cache_hit': symbol('CACHE_MISS')} branched = yp = None conn = def _execute_context( self, dialect, constructor, statement, parameters, execution_options, *args, **kw ): """Create an :class:`.ExecutionContext` and execute, returning a :class:`_engine.CursorResult`.""" branched = self if self.__branch_from: # if this is a "branched" connection, do everything in terms # of the "root" connection, *except* for .close(), which is # the only feature that branching provides self = self.__branch_from if execution_options: yp = execution_options.get("yield_per", None) if yp: execution_options = execution_options.union( {"stream_results": True, "max_row_buffer": yp} ) try: conn = self._dbapi_connection if conn is None: conn = 
self._revalidate_connection() > context = constructor( dialect, self, conn, execution_options, *args, **kw ) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ cls = dialect = connection = dbapi_connection = execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 32]}}) compiled = parameters = [{'category': 'a', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:53:10+00:00'}, ...}] invoked_statement = extracted_parameters = [], cache_hit = symbol('CACHE_MISS') @classmethod def _init_compiled( cls, dialect, connection, dbapi_connection, execution_options, compiled, parameters, invoked_statement, extracted_parameters, cache_hit=CACHING_DISABLED, ): """Initialize execution context for a Compiled construct.""" self = cls.__new__(cls) self.root_connection = connection self._dbapi_connection = dbapi_connection self.dialect = connection.dialect self.extracted_parameters = extracted_parameters self.invoked_statement = invoked_statement self.compiled = compiled self.cache_hit = cache_hit self.execution_options = execution_options self._is_future_result = ( connection._is_future or self.execution_options.get("future_result", False) ) self.result_column_struct = ( compiled._result_columns, compiled._ordered_columns, compiled._textual_ordered_columns, compiled._ad_hoc_textual, compiled._loose_column_name_matching, ) self.isinsert = compiled.isinsert self.isupdate = compiled.isupdate self.isdelete = compiled.isdelete self.is_text = compiled.isplaintext if self.isinsert or self.isupdate or self.isdelete: self.is_crud = True self._is_explicit_returning = bool(compiled.statement._returning) self._is_implicit_returning = bool( compiled.returning and not compiled.statement._returning ) if not parameters: self.compiled_parameters = [ compiled.construct_params( 
extracted_parameters=extracted_parameters, escape_names=False, ) ] else: self.compiled_parameters = [ compiled.construct_params( m, escape_names=False, _group_number=grp, extracted_parameters=extracted_parameters, ) for grp, m in enumerate(parameters) ] self.executemany = len(parameters) > 1 # this must occur before create_cursor() since the statement # has to be regexed in some cases for server side cursor if util.py2k: self.unicode_statement = util.text_type(compiled.string) else: self.unicode_statement = compiled.string self.cursor = self.create_cursor() if self.compiled.insert_prefetch or self.compiled.update_prefetch: if self.executemany: self._process_executemany_defaults() else: > self._process_executesingle_defaults() /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _process_executesingle_defaults(self): key_getter = self.compiled._within_exec_param_key_getter self.current_parameters = ( compiled_parameters ) = self.compiled_parameters[0] for c in self.compiled.insert_prefetch: if c.default and not c.default.is_sequence and c.default.is_scalar: val = c.default.arg else: > val = self.get_insert_default(c) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.primary_key and column is column.table._autoincrement_column: if column.server_default and column.server_default.has_argument: # pre-execute passive defaults on primary key columns return self._execute_scalar( "select %s" % column.server_default.arg, column.type ) elif column.default is None or ( column.default.is_sequence and column.default.optional ): # execute the sequence associated with a SERIAL primary # key column. 
for non-primary-key SERIAL, the ID just # generates server side. try: seq_name = column._postgresql_seq_name except AttributeError: tab = column.table.name col = column.name tab = tab[0 : 29 + max(0, (29 - len(col)))] col = col[0 : 29 + max(0, (29 - len(tab)))] name = "%s_%s_seq" % (tab, col) column._postgresql_seq_name = seq_name = name if column.table is not None: effective_schema = self.connection.schema_for_object( column.table ) else: effective_schema = None if effective_schema is not None: exc = 'select nextval(\'"%s"."%s"\')' % ( effective_schema, seq_name, ) else: exc = "select nextval('\"%s\"')" % (seq_name,) return self._execute_scalar(exc, column.type) > return super(PGExecutionContext, self).get_insert_default(column) /usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.default is None: return None else: > return self._exec_default(column, column.default, column.type) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) default = ColumnDefault() type_ = Unicode() def _exec_default(self, column, default, type_): if default.is_sequence: return self.fire_sequence(default, type_) elif default.is_callable: self.current_column = column > return default.arg(self) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ context = def source_version_default(context): > dist = pkg_resources.get_distribution("datanommer.models") ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ dist = Requirement.parse('datanommer.models') def get_distribution(dist): """Return a current distribution object for a Requirement or string""" if isinstance(dist, str): dist = Requirement.parse(dist) if isinstance(dist, Requirement): > dist = get_provider(dist) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ moduleOrReq = Requirement.parse('datanommer.models') def get_provider(moduleOrReq): """Return an IResourceProvider for the named module or requirement""" if isinstance(moduleOrReq, Requirement): > return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = requirements = ('datanommer.models',) def require(self, *requirements): """Ensure that distributions matching `requirements` are activated `requirements` must be a string or a (possibly-nested) sequence thereof, specifying the distributions and versions required. The return value is a sequence of the distributions that needed to be activated to fulfill the requirements; all relevant distributions are included, even if they were already activated in this working set. """ > needed = self.resolve(parse_requirements(requirements)) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , requirements = [] env = , installer = None replace_conflicting = False, extras = None def resolve(self, requirements, env=None, installer=None, # noqa: C901 replace_conflicting=False, extras=None): """List all distributions needed to (recursively) meet `requirements` `requirements` must be a sequence of ``Requirement`` objects. `env`, if supplied, should be an ``Environment`` instance. 
If not supplied, it defaults to all distributions available within any entry or distribution in the working set. `installer`, if supplied, will be invoked with each requirement that cannot be met by an already-installed distribution; it should return a ``Distribution`` or ``None``. Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. 
required_by = collections.defaultdict(set) while requirements: # process dependencies breadth-first req = requirements.pop(0) if req in processed: # Ignore cyclic or redundant dependencies continue if not req_extras.markers_pass(req, extras): continue dist = best.get(req.key) if dist is None: # Find the best distribution and add it to the map dist = self.by_key.get(req.key) if dist is None or (dist not in req and replace_conflicting): ws = self if env is None: if dist is None: env = Environment(self.entries) else: # Use an empty environment and workingset to avoid # any further conflicts with the conflicting # distribution env = Environment([]) ws = WorkingSet([]) dist = best[req.key] = env.best_match( req, ws, installer, replace_conflicting=replace_conflicting ) if dist is None: requirers = required_by.get(req, None) > raise DistributionNotFound(req, requirers) E pkg_resources.DistributionNotFound: The 'datanommer.models' distribution was not found and is required by the application /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: DistributionNotFound The above exception was the direct cause of the following exception: datanommer_models = def test_as_dict(datanommer_models): > add(generate_message()) tests/test_model.py:588: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:147: in add Message.create( ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:275: in create session.flush() :2: in flush ??? 
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3444: in flush self._flush(objects) /usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3583: in _flush with util.safe_reraise(): /usr/lib64/python3.11/site-packages/sqlalchemy/util/langhelpers.py:70: in __exit__ compat.raise_( /usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_ raise exception /usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3544: in _flush flush_context.execute() /usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:456: in execute rec.execute(self) /usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:630: in execute util.preloaded.orm_persistence.save_obj( /usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:245: in save_obj _emit_insert_statements( /usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:1238: in _emit_insert_statements result = connection._execute_20( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1705: in _execute_20 return meth(self, args_10style, kwargs_10style, execution_options) /usr/lib64/python3.11/site-packages/sqlalchemy/sql/elements.py:334: in _execute_on_connection return connection._execute_clauseelement( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1572: in _execute_clauseelement ret = self._execute_context( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1806: in _execute_context self._handle_dbapi_exception( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:2124: in _handle_dbapi_exception util.raise_( /usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_ raise exception /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: in _execute_context context = constructor( /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: in _init_compiled self._process_executesingle_defaults() /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: in 
_process_executesingle_defaults val = self.get_insert_default(c) /usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: in get_insert_default return super(PGExecutionContext, self).get_insert_default(column) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: in get_insert_default return self._exec_default(column, column.default, column.type) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: in _exec_default return default.arg(self) ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: in source_version_default dist = pkg_resources.get_distribution("datanommer.models") /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: in get_distribution dist = get_provider(dist) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: in get_provider return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: in require needed = self.resolve(parse_requirements(requirements)) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , requirements = [] env = , installer = None replace_conflicting = False, extras = None def resolve(self, requirements, env=None, installer=None, # noqa: C901 replace_conflicting=False, extras=None): """List all distributions needed to (recursively) meet `requirements` `requirements` must be a sequence of ``Requirement`` objects. `env`, if supplied, should be an ``Environment`` instance. If not supplied, it defaults to all distributions available within any entry or distribution in the working set. `installer`, if supplied, will be invoked with each requirement that cannot be met by an already-installed distribution; it should return a ``Distribution`` or ``None``. 
Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. required_by = collections.defaultdict(set) while requirements: # process dependencies breadth-first req = requirements.pop(0) if req in processed: # Ignore cyclic or redundant dependencies continue if not req_extras.markers_pass(req, extras): continue dist = best.get(req.key) if dist is None: # Find the best distribution and add it to the map dist = self.by_key.get(req.key) if dist is None or (dist not in req and replace_conflicting): ws = self if env is None: if dist is None: env = Environment(self.entries) else: # Use an empty environment and workingset to avoid # any further conflicts with the conflicting # distribution env = Environment([]) ws = WorkingSet([]) dist = best[req.key] = env.best_match( req, ws, installer, replace_conflicting=replace_conflicting ) if dist is None: requirers = required_by.get(req, None) > raise DistributionNotFound(req, requirers) E sqlalchemy.exc.StatementError: (pkg_resources.DistributionNotFound) The 'datanommer.models' distribution was not found and is required by the application E [SQL: INSERT INTO messages (msg_id, i, topic, timestamp, certificate, 
signature, category, username, crypto, source_name, source_version, msg, headers) VALUES (%(msg_id)s, %(i)s, %(topic)s, %(timestamp)s, %(certificate)s, %(signature)s, %(category)s, %(username)s, %(crypto)s, %(source_name)s, %(source_version)s, %(msg)s, %(headers)s) RETURNING messages.id] E [parameters: [{'topic': 'org.fedoraproject.test.a.nice.message', 'timestamp': datetime.datetime(2023, 5, 9, 2, 53, 10, tzinfo=datetime.timezone.utc), 'i': 0, 'head ... (164 characters truncated) ... d-47c8-9b67-5f41a9b9d3e1', 'msg': {'encouragement': "You're doing great!"}, 'username': None, 'crypto': None, 'certificate': None, 'signature': None}]] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: StatementError _____________________ test_as_dict_with_users_and_packages _____________________ self = dialect = constructor = > statement = parameters = [{'category': 'bodhi', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_rpm_abrt-addon-python3': Tru...g_rpm_kernel': True, 'fedora_messaging_schema': 'bodhi.update.comment.v1', 'fedora_messaging_severity': 20, ...}, ...}] execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 33]}}) args = (, [{'category': 'bodhi', 'cer....comment.v1', 'fedora_messaging_severity': 20, ...}, ...}], , []) kw = {'cache_hit': symbol('CACHE_MISS')} branched = yp = None conn = def _execute_context( self, dialect, constructor, statement, parameters, execution_options, *args, **kw ): """Create an :class:`.ExecutionContext` and execute, returning a :class:`_engine.CursorResult`.""" branched = self if self.__branch_from: # if this is a "branched" connection, do everything in terms # of the "root" connection, *except* for .close(), which is # the only feature that branching provides self = self.__branch_from if execution_options: yp = execution_options.get("yield_per", None) if yp: execution_options = execution_options.union( {"stream_results": True, "max_row_buffer": yp} ) try: conn = self._dbapi_connection 
if conn is None: conn = self._revalidate_connection() > context = constructor( dialect, self, conn, execution_options, *args, **kw ) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ cls = dialect = connection = dbapi_connection = execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 33]}}) compiled = parameters = [{'category': 'bodhi', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_rpm_abrt-addon-python3': Tru...g_rpm_kernel': True, 'fedora_messaging_schema': 'bodhi.update.comment.v1', 'fedora_messaging_severity': 20, ...}, ...}] invoked_statement = extracted_parameters = [], cache_hit = symbol('CACHE_MISS') @classmethod def _init_compiled( cls, dialect, connection, dbapi_connection, execution_options, compiled, parameters, invoked_statement, extracted_parameters, cache_hit=CACHING_DISABLED, ): """Initialize execution context for a Compiled construct.""" self = cls.__new__(cls) self.root_connection = connection self._dbapi_connection = dbapi_connection self.dialect = connection.dialect self.extracted_parameters = extracted_parameters self.invoked_statement = invoked_statement self.compiled = compiled self.cache_hit = cache_hit self.execution_options = execution_options self._is_future_result = ( connection._is_future or self.execution_options.get("future_result", False) ) self.result_column_struct = ( compiled._result_columns, compiled._ordered_columns, compiled._textual_ordered_columns, compiled._ad_hoc_textual, compiled._loose_column_name_matching, ) self.isinsert = compiled.isinsert self.isupdate = compiled.isupdate self.isdelete = compiled.isdelete self.is_text = compiled.isplaintext if self.isinsert or self.isupdate or self.isdelete: self.is_crud = True self._is_explicit_returning = bool(compiled.statement._returning) self._is_implicit_returning = bool( compiled.returning and not compiled.statement._returning ) if not 
parameters: self.compiled_parameters = [ compiled.construct_params( extracted_parameters=extracted_parameters, escape_names=False, ) ] else: self.compiled_parameters = [ compiled.construct_params( m, escape_names=False, _group_number=grp, extracted_parameters=extracted_parameters, ) for grp, m in enumerate(parameters) ] self.executemany = len(parameters) > 1 # this must occur before create_cursor() since the statement # has to be regexed in some cases for server side cursor if util.py2k: self.unicode_statement = util.text_type(compiled.string) else: self.unicode_statement = compiled.string self.cursor = self.create_cursor() if self.compiled.insert_prefetch or self.compiled.update_prefetch: if self.executemany: self._process_executemany_defaults() else: > self._process_executesingle_defaults() /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _process_executesingle_defaults(self): key_getter = self.compiled._within_exec_param_key_getter self.current_parameters = ( compiled_parameters ) = self.compiled_parameters[0] for c in self.compiled.insert_prefetch: if c.default and not c.default.is_sequence and c.default.is_scalar: val = c.default.arg else: > val = self.get_insert_default(c) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.primary_key and column is column.table._autoincrement_column: if column.server_default and column.server_default.has_argument: # pre-execute passive defaults on primary key columns return self._execute_scalar( "select %s" % column.server_default.arg, column.type ) elif column.default is None or ( column.default.is_sequence and column.default.optional ): # execute the sequence associated with a SERIAL 
primary # key column. for non-primary-key SERIAL, the ID just # generates server side. try: seq_name = column._postgresql_seq_name except AttributeError: tab = column.table.name col = column.name tab = tab[0 : 29 + max(0, (29 - len(col)))] col = col[0 : 29 + max(0, (29 - len(tab)))] name = "%s_%s_seq" % (tab, col) column._postgresql_seq_name = seq_name = name if column.table is not None: effective_schema = self.connection.schema_for_object( column.table ) else: effective_schema = None if effective_schema is not None: exc = 'select nextval(\'"%s"."%s"\')' % ( effective_schema, seq_name, ) else: exc = "select nextval('\"%s\"')" % (seq_name,) return self._execute_scalar(exc, column.type) > return super(PGExecutionContext, self).get_insert_default(column) /usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) def get_insert_default(self, column): if column.default is None: return None else: > return self._exec_default(column, column.default, column.type) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = column = Column('source_version', Unicode(), table=, default=ColumnDefault()) default = ColumnDefault() type_ = Unicode() def _exec_default(self, column, default, type_): if default.is_sequence: return self.fire_sequence(default, type_) elif default.is_callable: self.current_column = column > return default.arg(self) /usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ context = def source_version_default(context): > dist = pkg_resources.get_distribution("datanommer.models") 
../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ dist = Requirement.parse('datanommer.models') def get_distribution(dist): """Return a current distribution object for a Requirement or string""" if isinstance(dist, str): dist = Requirement.parse(dist) if isinstance(dist, Requirement): > dist = get_provider(dist) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ moduleOrReq = Requirement.parse('datanommer.models') def get_provider(moduleOrReq): """Return an IResourceProvider for the named module or requirement""" if isinstance(moduleOrReq, Requirement): > return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0] /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = requirements = ('datanommer.models',) def require(self, *requirements): """Ensure that distributions matching `requirements` are activated `requirements` must be a string or a (possibly-nested) sequence thereof, specifying the distributions and versions required. The return value is a sequence of the distributions that needed to be activated to fulfill the requirements; all relevant distributions are included, even if they were already activated in this working set. 
""" > needed = self.resolve(parse_requirements(requirements)) /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , requirements = [] env = , installer = None replace_conflicting = False, extras = None def resolve(self, requirements, env=None, installer=None, # noqa: C901 replace_conflicting=False, extras=None): """List all distributions needed to (recursively) meet `requirements` `requirements` must be a sequence of ``Requirement`` objects. `env`, if supplied, should be an ``Environment`` instance. If not supplied, it defaults to all distributions available within any entry or distribution in the working set. `installer`, if supplied, will be invoked with each requirement that cannot be met by an already-installed distribution; it should return a ``Distribution`` or ``None``. Unless `replace_conflicting=True`, raises a VersionConflict exception if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. `extras` is a list of the extras to be used with these requirements. This is important because extra requirements may look like `my_req; extra = "my_extra"`, which would otherwise be interpreted as a purely optional requirement. Instead, we want to be able to assert that these requirements are truly required. """ # set up the stack requirements = list(requirements)[::-1] # set of processed requirements processed = {} # key -> dist best = {} to_activate = [] req_extras = _ReqExtras() # Mapping of requirement to set of distributions that required it; # useful for reporting info about conflicts. 
required_by = collections.defaultdict(set) while requirements: # process dependencies breadth-first req = requirements.pop(0) if req in processed: # Ignore cyclic or redundant dependencies continue if not req_extras.markers_pass(req, extras): continue dist = best.get(req.key) if dist is None: # Find the best distribution and add it to the map dist = self.by_key.get(req.key) if dist is None or (dist not in req and replace_conflicting): ws = self if env is None: if dist is None: env = Environment(self.entries) else: # Use an empty environment and workingset to avoid # any further conflicts with the conflicting # distribution env = Environment([]) ws = WorkingSet([]) dist = best[req.key] = env.best_match( req, ws, installer, replace_conflicting=replace_conflicting ) if dist is None: requirers = required_by.get(req, None) > raise DistributionNotFound(req, requirers) E pkg_resources.DistributionNotFound: The 'datanommer.models' distribution was not found and is required by the application /usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: DistributionNotFound The above exception was the direct cause of the following exception: datanommer_models = def test_as_dict_with_users_and_packages(datanommer_models): > add(generate_bodhi_update_complete_message()) tests/test_model.py:599: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:147: in add Message.create( ../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:275: in create session.flush() :2: in flush ??? 
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3444: in flush
    self._flush(objects)
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3583: in _flush
    with util.safe_reraise():
/usr/lib64/python3.11/site-packages/sqlalchemy/util/langhelpers.py:70: in __exit__
    compat.raise_(
/usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_
    raise exception
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3544: in _flush
    flush_context.execute()
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:456: in execute
    rec.execute(self)
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:630: in execute
    util.preloaded.orm_persistence.save_obj(
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:245: in save_obj
    _emit_insert_statements(
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:1238: in _emit_insert_statements
    result = connection._execute_20(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1705: in _execute_20
    return meth(self, args_10style, kwargs_10style, execution_options)
/usr/lib64/python3.11/site-packages/sqlalchemy/sql/elements.py:334: in _execute_on_connection
    return connection._execute_clauseelement(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1572: in _execute_clauseelement
    ret = self._execute_context(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1806: in _execute_context
    self._handle_dbapi_exception(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:2124: in _handle_dbapi_exception
    util.raise_(
/usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_
    raise exception
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: in _execute_context
    context = constructor(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: in _init_compiled
    self._process_executesingle_defaults()
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: in _process_executesingle_defaults
    val = self.get_insert_default(c)
/usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: in get_insert_default
    return super(PGExecutionContext, self).get_insert_default(column)
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: in get_insert_default
    return self._exec_default(column, column.default, column.type)
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: in _exec_default
    return default.arg(self)
../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: in source_version_default
    dist = pkg_resources.get_distribution("datanommer.models")
/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: in get_distribution
    dist = get_provider(dist)
/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: in get_provider
    return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0]
/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: in require
    needed = self.resolve(parse_requirements(requirements))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = , requirements = []
env = , installer = None
replace_conflicting = False, extras = None

    def resolve(self, requirements, env=None, installer=None,  # noqa: C901
                replace_conflicting=False, extras=None):
        """List all distributions needed to (recursively) meet `requirements`

        `requirements` must be a sequence of ``Requirement`` objects.  `env`,
        if supplied, should be an ``Environment`` instance.  If
        not supplied, it defaults to all distributions available within any
        entry or distribution in the working set.  `installer`, if supplied,
        will be invoked with each requirement that cannot be met by an
        already-installed distribution; it should return a ``Distribution`` or
        ``None``.

        Unless `replace_conflicting=True`, raises a VersionConflict exception
        if any requirements are found on the path that have the correct name
        but the wrong version.  Otherwise, if an `installer` is supplied it
        will be invoked to obtain the correct version of the requirement and
        activate it.

        `extras` is a list of the extras to be used with these requirements.
        This is important because extra requirements may look like `my_req;
        extra = "my_extra"`, which would otherwise be interpreted as a purely
        optional requirement.  Instead, we want to be able to assert that these
        requirements are truly required.
        """

        # set up the stack
        requirements = list(requirements)[::-1]
        # set of processed requirements
        processed = {}
        # key -> dist
        best = {}
        to_activate = []

        req_extras = _ReqExtras()

        # Mapping of requirement to set of distributions that required it;
        # useful for reporting info about conflicts.
        required_by = collections.defaultdict(set)

        while requirements:
            # process dependencies breadth-first
            req = requirements.pop(0)
            if req in processed:
                # Ignore cyclic or redundant dependencies
                continue

            if not req_extras.markers_pass(req, extras):
                continue

            dist = best.get(req.key)
            if dist is None:
                # Find the best distribution and add it to the map
                dist = self.by_key.get(req.key)
                if dist is None or (dist not in req and replace_conflicting):
                    ws = self
                    if env is None:
                        if dist is None:
                            env = Environment(self.entries)
                        else:
                            # Use an empty environment and workingset to avoid
                            # any further conflicts with the conflicting
                            # distribution
                            env = Environment([])
                            ws = WorkingSet([])
                    dist = best[req.key] = env.best_match(
                        req, ws, installer,
                        replace_conflicting=replace_conflicting
                    )
                    if dist is None:
                        requirers = required_by.get(req, None)
>                       raise DistributionNotFound(req, requirers)
E                       sqlalchemy.exc.StatementError: (pkg_resources.DistributionNotFound) The 'datanommer.models' distribution was not found and is required by the application
E                       [SQL: INSERT INTO messages (msg_id, i, topic, timestamp, certificate,
signature, category, username, crypto, source_name, source_version, msg, headers) VALUES (%(msg_id)s, %(i)s, %(topic)s, %(timestamp)s, %(certificate)s, %(signature)s, %(category)s, %(username)s, %(crypto)s, %(source_name)s, %(source_version)s, %(msg)s, %(headers)s) RETURNING messages.id]
E                       [parameters: [{'topic': 'org.fedoraproject.stg.bodhi.update.comment', 'timestamp': datetime.datetime(2023, 5, 9, 2, 53, 24, tzinfo=datetime.timezone.utc), 'i': 0, ... (679 characters truncated) ... esting', 'user': {'name': 'ryanlerch'}}, 'user': {'name': 'dudemcpants'}}}, 'username': None, 'crypto': None, 'certificate': None, 'signature': None}]]

/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: StatementError
___________________________ test___json__deprecated ____________________________

self = 
dialect = 
constructor = >
statement = 
parameters = [{'category': 'a', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:53:38+00:00'}, ...}]
execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 1]}})
args = (, [{'category': 'a', 'certifi...erity': 20, 'sent-at': '2023-05-09T02:53:38+00:00'}, ...}], , [])
kw = {'cache_hit': symbol('CACHE_MISS')}
branched = 
yp = None

    def _execute_context(
        self, dialect, constructor, statement, parameters, execution_options, *args, **kw
    ):
        """Create an :class:`.ExecutionContext` and execute, returning
        a :class:`_engine.CursorResult`."""

        branched = self
        if self.__branch_from:
            # if this is a "branched" connection, do everything in terms
            # of the "root" connection, *except* for .close(), which is
            # the only feature that branching provides
            self = self.__branch_from

        if execution_options:
            yp = execution_options.get("yield_per", None)
            if yp:
                execution_options = execution_options.union(
                    {"stream_results": True, "max_row_buffer": yp}
                )

        try:
            conn = self._dbapi_connection
            if conn is None:
                conn = self._revalidate_connection()
>           context = constructor(
                dialect, self, conn, execution_options, *args, **kw
            )

/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

cls = 
dialect = 
connection = 
dbapi_connection = 
execution_options = immutabledict({'autocommit': True, 'compiled_cache': {(, 1]}})
compiled = 
parameters = [{'category': 'a', 'certificate': None, 'crypto': None, 'headers': {'fedora_messaging_schema': 'base.message', 'fedora_messaging_severity': 20, 'sent-at': '2023-05-09T02:53:38+00:00'}, ...}]
invoked_statement = 
extracted_parameters = [], cache_hit = symbol('CACHE_MISS')

    @classmethod
    def _init_compiled(
        cls,
        dialect,
        connection,
        dbapi_connection,
        execution_options,
        compiled,
        parameters,
        invoked_statement,
        extracted_parameters,
        cache_hit=CACHING_DISABLED,
    ):
        """Initialize execution context for a Compiled construct."""

        self = cls.__new__(cls)
        self.root_connection = connection
        self._dbapi_connection = dbapi_connection
        self.dialect = connection.dialect
        self.extracted_parameters = extracted_parameters
        self.invoked_statement = invoked_statement
        self.compiled = compiled
        self.cache_hit = cache_hit

        self.execution_options = execution_options

        self._is_future_result = (
            connection._is_future
            or self.execution_options.get("future_result", False)
        )

        self.result_column_struct = (
            compiled._result_columns,
            compiled._ordered_columns,
            compiled._textual_ordered_columns,
            compiled._ad_hoc_textual,
            compiled._loose_column_name_matching,
        )

        self.isinsert = compiled.isinsert
        self.isupdate = compiled.isupdate
        self.isdelete = compiled.isdelete
        self.is_text = compiled.isplaintext

        if self.isinsert or self.isupdate or self.isdelete:
            self.is_crud = True
            self._is_explicit_returning = bool(compiled.statement._returning)
            self._is_implicit_returning = bool(
                compiled.returning and not compiled.statement._returning
            )

        if not parameters:
            self.compiled_parameters = [
                compiled.construct_params(
                    extracted_parameters=extracted_parameters,
                    escape_names=False,
                )
            ]
        else:
            self.compiled_parameters = [
                compiled.construct_params(
                    m,
                    escape_names=False,
                    _group_number=grp,
                    extracted_parameters=extracted_parameters,
                )
                for grp, m in enumerate(parameters)
            ]

            self.executemany = len(parameters) > 1

        # this must occur before create_cursor() since the statement
        # has to be regexed in some cases for server side cursor
        if util.py2k:
            self.unicode_statement = util.text_type(compiled.string)
        else:
            self.unicode_statement = compiled.string

        self.cursor = self.create_cursor()

        if self.compiled.insert_prefetch or self.compiled.update_prefetch:
            if self.executemany:
                self._process_executemany_defaults()
            else:
>               self._process_executesingle_defaults()

/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 

    def _process_executesingle_defaults(self):
        key_getter = self.compiled._within_exec_param_key_getter
        self.current_parameters = (
            compiled_parameters
        ) = self.compiled_parameters[0]

        for c in self.compiled.insert_prefetch:
            if c.default and not c.default.is_sequence and c.default.is_scalar:
                val = c.default.arg
            else:
>               val = self.get_insert_default(c)

/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
column = Column('source_version', Unicode(), table=, default=ColumnDefault())

    def get_insert_default(self, column):
        if column.primary_key and column is column.table._autoincrement_column:
            if column.server_default and column.server_default.has_argument:
                # pre-execute passive defaults on primary key columns
                return self._execute_scalar(
                    "select %s" % column.server_default.arg, column.type
                )
            elif column.default is None or (
                column.default.is_sequence and column.default.optional
            ):
                # execute the sequence associated with a SERIAL primary
                # key column.
                # for non-primary-key SERIAL, the ID just
                # generates server side.

                try:
                    seq_name = column._postgresql_seq_name
                except AttributeError:
                    tab = column.table.name
                    col = column.name
                    tab = tab[0 : 29 + max(0, (29 - len(col)))]
                    col = col[0 : 29 + max(0, (29 - len(tab)))]
                    name = "%s_%s_seq" % (tab, col)
                    column._postgresql_seq_name = seq_name = name

                if column.table is not None:
                    effective_schema = self.connection.schema_for_object(
                        column.table
                    )
                else:
                    effective_schema = None

                if effective_schema is not None:
                    exc = 'select nextval(\'"%s"."%s"\')' % (
                        effective_schema,
                        seq_name,
                    )
                else:
                    exc = "select nextval('\"%s\"')" % (seq_name,)

                return self._execute_scalar(exc, column.type)

>       return super(PGExecutionContext, self).get_insert_default(column)

/usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
column = Column('source_version', Unicode(), table=, default=ColumnDefault())

    def get_insert_default(self, column):
        if column.default is None:
            return None
        else:
>           return self._exec_default(column, column.default, column.type)

/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
column = Column('source_version', Unicode(), table=, default=ColumnDefault())
default = ColumnDefault()
type_ = Unicode()

    def _exec_default(self, column, default, type_):
        if default.is_sequence:
            return self.fire_sequence(default, type_)
        elif default.is_callable:
            self.current_column = column
>           return default.arg(self)

/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

context = 

    def source_version_default(context):
>       dist = pkg_resources.get_distribution("datanommer.models")

../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

dist = Requirement.parse('datanommer.models')

    def get_distribution(dist):
        """Return a current distribution object for a Requirement or string"""
        if isinstance(dist, str):
            dist = Requirement.parse(dist)
        if isinstance(dist, Requirement):
>           dist = get_provider(dist)

/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

moduleOrReq = Requirement.parse('datanommer.models')

    def get_provider(moduleOrReq):
        """Return an IResourceProvider for the named module or requirement"""
        if isinstance(moduleOrReq, Requirement):
>           return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0]

/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
requirements = ('datanommer.models',)

    def require(self, *requirements):
        """Ensure that distributions matching `requirements` are activated

        `requirements` must be a string or a (possibly-nested) sequence
        thereof, specifying the distributions and versions required.  The
        return value is a sequence of the distributions that needed to be
        activated to fulfill the requirements; all relevant distributions are
        included, even if they were already activated in this working set.
        """
>       needed = self.resolve(parse_requirements(requirements))

/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = , requirements = []
env = , installer = None
replace_conflicting = False, extras = None

    def resolve(self, requirements, env=None, installer=None,  # noqa: C901
                replace_conflicting=False, extras=None):
        """List all distributions needed to (recursively) meet `requirements`

        `requirements` must be a sequence of ``Requirement`` objects.
        `env`, if supplied, should be an ``Environment`` instance.  If
        not supplied, it defaults to all distributions available within any
        entry or distribution in the working set.  `installer`, if supplied,
        will be invoked with each requirement that cannot be met by an
        already-installed distribution; it should return a ``Distribution`` or
        ``None``.

        Unless `replace_conflicting=True`, raises a VersionConflict exception
        if any requirements are found on the path that have the correct name
        but the wrong version.  Otherwise, if an `installer` is supplied it
        will be invoked to obtain the correct version of the requirement and
        activate it.

        `extras` is a list of the extras to be used with these requirements.
        This is important because extra requirements may look like `my_req;
        extra = "my_extra"`, which would otherwise be interpreted as a purely
        optional requirement.  Instead, we want to be able to assert that these
        requirements are truly required.
        """

        # set up the stack
        requirements = list(requirements)[::-1]
        # set of processed requirements
        processed = {}
        # key -> dist
        best = {}
        to_activate = []

        req_extras = _ReqExtras()

        # Mapping of requirement to set of distributions that required it;
        # useful for reporting info about conflicts.
        required_by = collections.defaultdict(set)

        while requirements:
            # process dependencies breadth-first
            req = requirements.pop(0)
            if req in processed:
                # Ignore cyclic or redundant dependencies
                continue

            if not req_extras.markers_pass(req, extras):
                continue

            dist = best.get(req.key)
            if dist is None:
                # Find the best distribution and add it to the map
                dist = self.by_key.get(req.key)
                if dist is None or (dist not in req and replace_conflicting):
                    ws = self
                    if env is None:
                        if dist is None:
                            env = Environment(self.entries)
                        else:
                            # Use an empty environment and workingset to avoid
                            # any further conflicts with the conflicting
                            # distribution
                            env = Environment([])
                            ws = WorkingSet([])
                    dist = best[req.key] = env.best_match(
                        req, ws, installer,
                        replace_conflicting=replace_conflicting
                    )
                    if dist is None:
                        requirers = required_by.get(req, None)
>                       raise DistributionNotFound(req, requirers)
E                       pkg_resources.DistributionNotFound: The 'datanommer.models' distribution was not found and is required by the application

/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: DistributionNotFound

The above exception was the direct cause of the following exception:

datanommer_models = 
caplog = <_pytest.logging.LogCaptureFixture object at 0xffffff967fcf50>
mocker = 

    def test___json__deprecated(datanommer_models, caplog, mocker):
        mock_as_dict = mocker.patch("datanommer.models.Message.as_dict")
>       add(generate_message())

tests/test_model.py:610: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:147: in add
    Message.create(
../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:275: in create
    session.flush()
:2: in flush
    ???
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3444: in flush
    self._flush(objects)
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3583: in _flush
    with util.safe_reraise():
/usr/lib64/python3.11/site-packages/sqlalchemy/util/langhelpers.py:70: in __exit__
    compat.raise_(
/usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_
    raise exception
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/session.py:3544: in _flush
    flush_context.execute()
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:456: in execute
    rec.execute(self)
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/unitofwork.py:630: in execute
    util.preloaded.orm_persistence.save_obj(
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:245: in save_obj
    _emit_insert_statements(
/usr/lib64/python3.11/site-packages/sqlalchemy/orm/persistence.py:1238: in _emit_insert_statements
    result = connection._execute_20(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1705: in _execute_20
    return meth(self, args_10style, kwargs_10style, execution_options)
/usr/lib64/python3.11/site-packages/sqlalchemy/sql/elements.py:334: in _execute_on_connection
    return connection._execute_clauseelement(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1572: in _execute_clauseelement
    ret = self._execute_context(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1806: in _execute_context
    self._handle_dbapi_exception(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:2124: in _handle_dbapi_exception
    util.raise_(
/usr/lib64/python3.11/site-packages/sqlalchemy/util/compat.py:211: in raise_
    raise exception
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/base.py:1800: in _execute_context
    context = constructor(
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1026: in _init_compiled
    self._process_executesingle_defaults()
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1928: in _process_executesingle_defaults
    val = self.get_insert_default(c)
/usr/lib64/python3.11/site-packages/sqlalchemy/dialects/postgresql/base.py:3297: in get_insert_default
    return super(PGExecutionContext, self).get_insert_default(column)
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1872: in get_insert_default
    return self._exec_default(column, column.default, column.type)
/usr/lib64/python3.11/site-packages/sqlalchemy/engine/default.py:1736: in _exec_default
    return default.arg(self)
../../BUILDROOT/python-datanommer-models-1.0.4-5.fc38.noarch/usr/lib/python3.11/site-packages/datanommer/models/__init__.py:162: in source_version_default
    dist = pkg_resources.get_distribution("datanommer.models")
/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:478: in get_distribution
    dist = get_provider(dist)
/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:354: in get_provider
    return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0]
/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:909: in require
    needed = self.resolve(parse_requirements(requirements))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = , requirements = []
env = , installer = None
replace_conflicting = False, extras = None

    def resolve(self, requirements, env=None, installer=None,  # noqa: C901
                replace_conflicting=False, extras=None):
        """List all distributions needed to (recursively) meet `requirements`

        `requirements` must be a sequence of ``Requirement`` objects.
        `env`, if supplied, should be an ``Environment`` instance.  If
        not supplied, it defaults to all distributions available within any
        entry or distribution in the working set.  `installer`, if supplied,
        will be invoked with each requirement that cannot be met by an
        already-installed distribution; it should return a ``Distribution`` or
        ``None``.

        Unless `replace_conflicting=True`, raises a VersionConflict exception
        if any requirements are found on the path that have the correct name
        but the wrong version.  Otherwise, if an `installer` is supplied it
        will be invoked to obtain the correct version of the requirement and
        activate it.

        `extras` is a list of the extras to be used with these requirements.
        This is important because extra requirements may look like `my_req;
        extra = "my_extra"`, which would otherwise be interpreted as a purely
        optional requirement.  Instead, we want to be able to assert that these
        requirements are truly required.
        """

        # set up the stack
        requirements = list(requirements)[::-1]
        # set of processed requirements
        processed = {}
        # key -> dist
        best = {}
        to_activate = []

        req_extras = _ReqExtras()

        # Mapping of requirement to set of distributions that required it;
        # useful for reporting info about conflicts.
        required_by = collections.defaultdict(set)

        while requirements:
            # process dependencies breadth-first
            req = requirements.pop(0)
            if req in processed:
                # Ignore cyclic or redundant dependencies
                continue

            if not req_extras.markers_pass(req, extras):
                continue

            dist = best.get(req.key)
            if dist is None:
                # Find the best distribution and add it to the map
                dist = self.by_key.get(req.key)
                if dist is None or (dist not in req and replace_conflicting):
                    ws = self
                    if env is None:
                        if dist is None:
                            env = Environment(self.entries)
                        else:
                            # Use an empty environment and workingset to avoid
                            # any further conflicts with the conflicting
                            # distribution
                            env = Environment([])
                            ws = WorkingSet([])
                    dist = best[req.key] = env.best_match(
                        req, ws, installer,
                        replace_conflicting=replace_conflicting
                    )
                    if dist is None:
                        requirers = required_by.get(req, None)
>                       raise DistributionNotFound(req, requirers)
E                       sqlalchemy.exc.StatementError: (pkg_resources.DistributionNotFound) The 'datanommer.models' distribution was not found and is required by the application
E                       [SQL: INSERT INTO messages (msg_id, i, topic, timestamp, certificate,
signature, category, username, crypto, source_name, source_version, msg, headers) VALUES (%(msg_id)s, %(i)s, %(topic)s, %(timestamp)s, %(certificate)s, %(signature)s, %(category)s, %(username)s, %(crypto)s, %(source_name)s, %(source_version)s, %(msg)s, %(headers)s) RETURNING messages.id]
E                       [parameters: [{'topic': 'org.fedoraproject.test.a.nice.message', 'timestamp': datetime.datetime(2023, 5, 9, 2, 53, 38, tzinfo=datetime.timezone.utc), 'i': 0, 'head ... (164 characters truncated) ... a-40c6-8059-2b06f4ed1ed6', 'msg': {'encouragement': "You're doing great!"}, 'username': None, 'crypto': None, 'certificate': None, 'signature': None}]]

/usr/lib/python3.11/site-packages/pkg_resources/__init__.py:795: StatementError
=============================== warnings summary ===============================
tests/test_jsonencodeddict.py::test_jsonencodeddict
  /builddir/build/BUILD/datanommer.models-1.0.4/tests/test_jsonencodeddict.py:24: RemovedIn20Warning: Deprecated API features detected! These feature(s) are not compatible with SQLAlchemy 2.0. To prevent incompatible upgrades prior to updating applications, ensure requirements files are pinned to "sqlalchemy<2.0". Set environment variable SQLALCHEMY_WARN_20=1 to show all deprecation warnings. Set environment variable SQLALCHEMY_SILENCE_UBER_WARNING=1 to silence this message. (Background on SQLAlchemy 2.0 at: https://sqlalche.me/e/b8d9)
    metadata.create_all(connection)

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
=========================== short test summary info ============================
FAILED tests/test_model.py::test_unclassified_category - sqlalchemy.exc.State...
FAILED tests/test_model.py::test_from_msg_id - sqlalchemy.exc.StatementError:...
FAILED tests/test_model.py::test_add_missing_msg_id - sqlalchemy.exc.Statemen...
FAILED tests/test_model.py::test_add_missing_timestamp - sqlalchemy.exc.State...
FAILED tests/test_model.py::test_add_timestamp_with_Z - sqlalchemy.exc.Statem...
FAILED tests/test_model.py::test_add_and_check_for_others - sqlalchemy.exc.St...
FAILED tests/test_model.py::test_add_and_check - sqlalchemy.exc.StatementErro...
FAILED tests/test_model.py::test_categories - sqlalchemy.exc.StatementError: ...
FAILED tests/test_model.py::test_categories_with_umb - sqlalchemy.exc.Stateme...
FAILED tests/test_model.py::test_grep_all - sqlalchemy.exc.StatementError: (p...
FAILED tests/test_model.py::test_grep_category - sqlalchemy.exc.StatementErro...
FAILED tests/test_model.py::test_grep_not_category - sqlalchemy.exc.Statement...
FAILED tests/test_model.py::test_add_headers - sqlalchemy.exc.StatementError:...
FAILED tests/test_model.py::test_grep_topics - sqlalchemy.exc.StatementError:...
FAILED tests/test_model.py::test_grep_not_topics - sqlalchemy.exc.StatementEr...
FAILED tests/test_model.py::test_grep_start_end - sqlalchemy.exc.StatementErr...
FAILED tests/test_model.py::test_grep_msg_id - sqlalchemy.exc.StatementError:...
FAILED tests/test_model.py::test_grep_users - sqlalchemy.exc.StatementError: ...
FAILED tests/test_model.py::test_grep_not_users - sqlalchemy.exc.StatementErr...
FAILED tests/test_model.py::test_grep_packages - sqlalchemy.exc.StatementErro...
FAILED tests/test_model.py::test_grep_not_packages - sqlalchemy.exc.Statement...
FAILED tests/test_model.py::test_grep_contains - sqlalchemy.exc.StatementErro...
FAILED tests/test_model.py::test_grep_rows_per_page_none - sqlalchemy.exc.Sta...
FAILED tests/test_model.py::test_grep_rows_per_page_zero - sqlalchemy.exc.Sta...
FAILED tests/test_model.py::test_grep_defer - sqlalchemy.exc.StatementError: ...
FAILED tests/test_model.py::test_add_duplicate - sqlalchemy.exc.StatementErro...
FAILED tests/test_model.py::test_add_duplicate_package - sqlalchemy.exc.State...
FAILED tests/test_model.py::test_add_message_with_error_on_packages - sqlalch...
FAILED tests/test_model.py::test_as_fedora_message_dict - sqlalchemy.exc.Stat...
FAILED tests/test_model.py::test_as_fedora_message_dict_old_headers - sqlalch...
FAILED tests/test_model.py::test_as_fedora_message_dict_no_headers - sqlalche...
FAILED tests/test_model.py::test_as_dict - sqlalchemy.exc.StatementError: (pk...
FAILED tests/test_model.py::test_as_dict_with_users_and_packages - sqlalchemy...
FAILED tests/test_model.py::test___json__deprecated - sqlalchemy.exc.Statemen...
============= 34 failed, 14 passed, 1 warning in 534.90s (0:08:54) =============
RPM build errors:
error: Bad exit status from /var/tmp/rpm-tmp.xv2SVi (%check)
    Bad exit status from /var/tmp/rpm-tmp.xv2SVi (%check)
Child return code was: 1
EXCEPTION: [Error('Command failed: \n # bash --login -c /usr/bin/rpmbuild -ba --noprep --noclean --target noarch --nodeps /builddir/build/SPECS/python-datanommer-models.spec\n', 1)]
Traceback (most recent call last):
  File "/usr/lib/python3.11/site-packages/mockbuild/trace_decorator.py", line 93, in trace
    result = func(*args, **kw)
             ^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/site-packages/mockbuild/util.py", line 598, in do_with_status
    raise exception.Error("Command failed: \n # %s\n%s" % (command, output), child.returncode)
mockbuild.exception.Error: Command failed: 
 # bash --login -c /usr/bin/rpmbuild -ba --noprep --noclean --target noarch --nodeps /builddir/build/SPECS/python-datanommer-models.spec
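Editor's note on the root cause: every failing test hits the same chain visible in the frames above. The `source_version_default` column default calls `pkg_resources.get_distribution("datanommer.models")`, which raises `DistributionNotFound` because the distribution's metadata is not visible to the interpreter running the tests (the package is imported from BUILDROOT, presumably without its dist-info on `sys.path`). A minimal sketch of a more tolerant default, assuming `importlib.metadata` semantics; the function name mirrors the traceback, but the fallback value `"unknown"` is illustrative:

```python
# Hypothetical, failure-tolerant variant of the source_version_default
# callable seen in the traceback above. It resolves the version via
# importlib.metadata and degrades gracefully when the distribution
# metadata is absent, instead of raising at INSERT time.
from importlib.metadata import PackageNotFoundError, version


def source_version_default(context):
    # `context` mirrors SQLAlchemy's default-execution context argument;
    # it is unused here, as in the original callable.
    try:
        return version("datanommer.models")
    except PackageNotFoundError:
        # Metadata not installed (e.g. tests run against an uninstalled
        # tree); return a placeholder rather than failing every INSERT.
        return "unknown"
```

With this shape, the column default always yields a string, so the `%check` tests would not abort on the missing metadata alone.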
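For orientation, the repeated `_exec_default` frames show why the error surfaces on every INSERT rather than at import time: a callable column default is only invoked during statement execution, and any exception it raises is wrapped in a `StatementError`. A pure-Python mimic of that dispatch (all class and function names here are illustrative, not SQLAlchemy's API):

```python
# Pure-Python mimic of the callable-default dispatch seen in the
# _exec_default frames above. Illustrative only.
class ColumnDefault:
    def __init__(self, arg):
        self.arg = arg
        self.is_callable = callable(arg)


class StatementError(Exception):
    """Stand-in for sqlalchemy.exc.StatementError wrapping the original."""


def exec_default(context, default):
    # A callable default runs at INSERT time; its exception propagates out
    # of statement compilation and is wrapped by the engine, which is the
    # (pkg_resources.DistributionNotFound) StatementError pattern above.
    if default.is_callable:
        try:
            return default.arg(context)
        except Exception as exc:
            raise StatementError(f"({type(exc).__name__}) {exc}") from exc
    return default.arg
```

This is why 34 otherwise unrelated tests fail identically: each one eventually flushes a `messages` row, which re-runs the failing default.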