2026-01-22 16:13:41,353 p=30776 u=zuul n=ansible | Starting galaxy collection install process
2026-01-22 16:13:41,354 p=30776 u=zuul n=ansible | Process install dependency map
2026-01-22 16:13:56,931 p=30776 u=zuul n=ansible | Starting collection install process
2026-01-22 16:13:56,931 p=30776 u=zuul n=ansible | Installing 'cifmw.general:1.0.0+a1c87a94' to '/home/zuul/.ansible/collections/ansible_collections/cifmw/general'
2026-01-22 16:13:57,402 p=30776 u=zuul n=ansible | Created collection for cifmw.general:1.0.0+a1c87a94 at /home/zuul/.ansible/collections/ansible_collections/cifmw/general
2026-01-22 16:13:57,402 p=30776 u=zuul n=ansible | cifmw.general:1.0.0+a1c87a94 was installed successfully
2026-01-22 16:13:57,402 p=30776 u=zuul n=ansible | Installing 'containers.podman:1.16.2' to '/home/zuul/.ansible/collections/ansible_collections/containers/podman'
2026-01-22 16:13:57,457 p=30776 u=zuul n=ansible | Created collection for containers.podman:1.16.2 at /home/zuul/.ansible/collections/ansible_collections/containers/podman
2026-01-22 16:13:57,457 p=30776 u=zuul n=ansible | containers.podman:1.16.2 was installed successfully
2026-01-22 16:13:57,457 p=30776 u=zuul n=ansible | Installing 'community.general:10.0.1' to '/home/zuul/.ansible/collections/ansible_collections/community/general'
2026-01-22 16:13:58,160 p=30776 u=zuul n=ansible | Created collection for community.general:10.0.1 at /home/zuul/.ansible/collections/ansible_collections/community/general
2026-01-22 16:13:58,160 p=30776 u=zuul n=ansible | community.general:10.0.1 was installed successfully
2026-01-22 16:13:58,160 p=30776 u=zuul n=ansible | Installing 'ansible.posix:1.6.2' to '/home/zuul/.ansible/collections/ansible_collections/ansible/posix'
2026-01-22 16:13:58,209 p=30776 u=zuul n=ansible | Created collection for ansible.posix:1.6.2 at /home/zuul/.ansible/collections/ansible_collections/ansible/posix
2026-01-22 16:13:58,209 p=30776 u=zuul n=ansible | ansible.posix:1.6.2 was installed successfully
2026-01-22 16:13:58,209 p=30776 u=zuul n=ansible | Installing 'ansible.utils:5.1.2' to '/home/zuul/.ansible/collections/ansible_collections/ansible/utils'
2026-01-22 16:13:58,302 p=30776 u=zuul n=ansible | Created collection for ansible.utils:5.1.2 at /home/zuul/.ansible/collections/ansible_collections/ansible/utils
2026-01-22 16:13:58,302 p=30776 u=zuul n=ansible | ansible.utils:5.1.2 was installed successfully
2026-01-22 16:13:58,302 p=30776 u=zuul n=ansible | Installing 'community.libvirt:1.3.0' to '/home/zuul/.ansible/collections/ansible_collections/community/libvirt'
2026-01-22 16:13:58,325 p=30776 u=zuul n=ansible | Created collection for community.libvirt:1.3.0 at /home/zuul/.ansible/collections/ansible_collections/community/libvirt
2026-01-22 16:13:58,325 p=30776 u=zuul n=ansible | community.libvirt:1.3.0 was installed successfully
2026-01-22 16:13:58,325 p=30776 u=zuul n=ansible | Installing 'community.crypto:2.22.3' to '/home/zuul/.ansible/collections/ansible_collections/community/crypto'
2026-01-22 16:13:58,464 p=30776 u=zuul n=ansible | Created collection for community.crypto:2.22.3 at /home/zuul/.ansible/collections/ansible_collections/community/crypto
2026-01-22 16:13:58,464 p=30776 u=zuul n=ansible | community.crypto:2.22.3 was installed successfully
2026-01-22 16:13:58,464 p=30776 u=zuul n=ansible | Installing 'kubernetes.core:5.0.0' to '/home/zuul/.ansible/collections/ansible_collections/kubernetes/core'
2026-01-22 16:13:58,577 p=30776 u=zuul n=ansible | Created collection for kubernetes.core:5.0.0 at /home/zuul/.ansible/collections/ansible_collections/kubernetes/core
2026-01-22 16:13:58,577 p=30776 u=zuul n=ansible | kubernetes.core:5.0.0 was installed successfully
2026-01-22 16:13:58,577 p=30776 u=zuul n=ansible | Installing 'ansible.netcommon:7.1.0' to '/home/zuul/.ansible/collections/ansible_collections/ansible/netcommon'
2026-01-22 16:13:58,642 p=30776 u=zuul n=ansible | Created collection for ansible.netcommon:7.1.0 at /home/zuul/.ansible/collections/ansible_collections/ansible/netcommon
2026-01-22 16:13:58,642 p=30776 u=zuul n=ansible | ansible.netcommon:7.1.0 was installed successfully
2026-01-22 16:13:58,642 p=30776 u=zuul n=ansible | Installing 'openstack.config_template:2.1.1' to '/home/zuul/.ansible/collections/ansible_collections/openstack/config_template'
2026-01-22 16:13:58,660 p=30776 u=zuul n=ansible | Created collection for openstack.config_template:2.1.1 at /home/zuul/.ansible/collections/ansible_collections/openstack/config_template
2026-01-22 16:13:58,660 p=30776 u=zuul n=ansible | openstack.config_template:2.1.1 was installed successfully
2026-01-22 16:13:58,660 p=30776 u=zuul n=ansible | Installing 'junipernetworks.junos:9.1.0' to '/home/zuul/.ansible/collections/ansible_collections/junipernetworks/junos'
2026-01-22 16:13:58,880 p=30776 u=zuul n=ansible | Created collection for junipernetworks.junos:9.1.0 at /home/zuul/.ansible/collections/ansible_collections/junipernetworks/junos
2026-01-22 16:13:58,880 p=30776 u=zuul n=ansible | junipernetworks.junos:9.1.0 was installed successfully
2026-01-22 16:13:58,880 p=30776 u=zuul n=ansible | Installing 'cisco.ios:9.0.3' to '/home/zuul/.ansible/collections/ansible_collections/cisco/ios'
2026-01-22 16:13:59,139 p=30776 u=zuul n=ansible | Created collection for cisco.ios:9.0.3 at /home/zuul/.ansible/collections/ansible_collections/cisco/ios
2026-01-22 16:13:59,140 p=30776 u=zuul n=ansible | cisco.ios:9.0.3 was installed successfully
2026-01-22 16:13:59,140 p=30776 u=zuul n=ansible | Installing 'mellanox.onyx:1.0.0' to '/home/zuul/.ansible/collections/ansible_collections/mellanox/onyx'
2026-01-22 16:13:59,173 p=30776 u=zuul n=ansible | Created collection for mellanox.onyx:1.0.0 at /home/zuul/.ansible/collections/ansible_collections/mellanox/onyx
2026-01-22 16:13:59,173 p=30776 u=zuul n=ansible | mellanox.onyx:1.0.0 was installed successfully
2026-01-22 16:13:59,173 p=30776 u=zuul n=ansible | Installing 'community.okd:4.0.0' to '/home/zuul/.ansible/collections/ansible_collections/community/okd'
2026-01-22 16:13:59,201 p=30776 u=zuul n=ansible | Created collection for community.okd:4.0.0 at /home/zuul/.ansible/collections/ansible_collections/community/okd
2026-01-22 16:13:59,201 p=30776 u=zuul n=ansible | community.okd:4.0.0 was installed successfully
2026-01-22 16:13:59,202 p=30776 u=zuul n=ansible | Installing '@NAMESPACE@.@NAME@:3.1.4' to '/home/zuul/.ansible/collections/ansible_collections/@NAMESPACE@/@NAME@'
2026-01-22 16:13:59,286 p=30776 u=zuul n=ansible | Created collection for @NAMESPACE@.@NAME@:3.1.4 at /home/zuul/.ansible/collections/ansible_collections/@NAMESPACE@/@NAME@
2026-01-22 16:13:59,286 p=30776 u=zuul n=ansible | @NAMESPACE@.@NAME@:3.1.4 was installed successfully
2026-01-22 16:14:09,915 p=31411 u=zuul n=ansible | PLAY [Remove status flag] ******************************************************
2026-01-22 16:14:09,937 p=31411 u=zuul n=ansible | TASK [Gathering Facts ] ********************************************************
2026-01-22 16:14:09,937 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:09 +0000 (0:00:00.041) 0:00:00.041 ******
2026-01-22 16:14:09,937 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:09 +0000 (0:00:00.040) 0:00:00.040 ******
2026-01-22 16:14:10,948 p=31411 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:14:10,972 p=31411 u=zuul n=ansible | TASK [Delete success flag if exists path={{ ansible_user_dir }}/cifmw-success, state=absent] ***
2026-01-22 16:14:10,972 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:10 +0000 (0:00:01.034) 0:00:01.076 ******
2026-01-22 16:14:10,972 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:10 +0000 (0:00:01.034) 0:00:01.075 ******
2026-01-22 16:14:11,301 p=31411 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:14:11,315 p=31411 u=zuul n=ansible | TASK [Inherit from parent scenarios if needed _raw_params=ci/playbooks/tasks/inherit_parent_scenario.yml] ***
2026-01-22 16:14:11,315 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:11 +0000 (0:00:00.343) 0:00:01.419 ******
2026-01-22 16:14:11,316 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:11 +0000 (0:00:00.343) 0:00:01.418 ******
2026-01-22 16:14:11,337 p=31411 u=zuul n=ansible | included: /home/zuul/src/github.com/openstack-k8s-operators/ci-framework/ci/playbooks/tasks/inherit_parent_scenario.yml for localhost
2026-01-22 16:14:11,380 p=31411 u=zuul n=ansible | TASK [Inherit from parent parameter file if instructed file={{ item }}] ********
2026-01-22 16:14:11,380 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:11 +0000 (0:00:00.064) 0:00:01.484 ******
2026-01-22 16:14:11,380 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:11 +0000 (0:00:00.064) 0:00:01.483 ******
2026-01-22 16:14:11,420 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:14:11,426 p=31411 u=zuul n=ansible | TASK [cifmw_setup : Set custom cifmw PATH reusable fact cifmw_path={{ ansible_user_dir }}/.crc/bin:{{ ansible_user_dir }}/.crc/bin/oc:{{ ansible_user_dir }}/bin:{{ ansible_env.PATH }}, cacheable=True] ***
2026-01-22 16:14:11,426 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:11 +0000 (0:00:00.046) 0:00:01.530 ******
2026-01-22 16:14:11,426 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:11 +0000 (0:00:00.046) 0:00:01.529 ******
2026-01-22 16:14:11,464 p=31411 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:14:11,473 p=31411 u=zuul n=ansible | TASK [cifmw_setup : Get customized parameters ci_framework_params={{ hostvars[inventory_hostname] | dict2items | selectattr("key", "match", "^(cifmw|pre|post)_(?!install_yamls|openshift_token|openshift_login|openshift_kubeconfig).*") | list | items2dict }}] ***
2026-01-22 16:14:11,473 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:11 +0000 (0:00:00.046) 0:00:01.577 ******
2026-01-22 16:14:11,473 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:11 +0000 (0:00:00.046) 0:00:01.576 ******
2026-01-22 16:14:11,556 p=31411 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:14:11,565 p=31411 u=zuul n=ansible | TASK [install_ca : Ensure target directory exists path={{ cifmw_install_ca_trust_dir }}, state=directory, mode=0755] ***
2026-01-22 16:14:11,565 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:11 +0000 (0:00:00.092) 0:00:01.669 ******
2026-01-22 16:14:11,566 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:11 +0000 (0:00:00.092) 0:00:01.668 ******
2026-01-22 16:14:11,798 p=31411 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:14:11,807 p=31411 u=zuul n=ansible | TASK [install_ca : Install internal CA from url url={{ cifmw_install_ca_url }}, dest={{ cifmw_install_ca_trust_dir }}, validate_certs={{ cifmw_install_ca_url_validate_certs | default(omit) }}, mode=0644] ***
2026-01-22 16:14:11,807 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:11 +0000 (0:00:00.241) 0:00:01.911 ******
2026-01-22 16:14:11,807 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:11 +0000 (0:00:00.241) 0:00:01.910 ******
2026-01-22 16:14:11,829 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:14:11,839 p=31411 u=zuul n=ansible | TASK [install_ca : Install custom CA bundle from inline dest={{ cifmw_install_ca_trust_dir }}/cifmw_inline_ca_bundle.crt, content={{ cifmw_install_ca_bundle_inline }}, mode=0644] ***
2026-01-22 16:14:11,839 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:11 +0000 (0:00:00.032) 0:00:01.943 ******
2026-01-22 16:14:11,839 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:11 +0000 (0:00:00.032) 0:00:01.942 ******
2026-01-22 16:14:11,860 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:14:11,869 p=31411 u=zuul n=ansible | TASK [install_ca : Install custom CA bundle from file dest={{ cifmw_install_ca_trust_dir }}/{{ cifmw_install_ca_bundle_src | basename }}, src={{ cifmw_install_ca_bundle_src }}, mode=0644] ***
2026-01-22 16:14:11,869 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:11 +0000 (0:00:00.029) 0:00:01.973 ******
2026-01-22 16:14:11,869 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:11 +0000 (0:00:00.029) 0:00:01.971 ******
2026-01-22 16:14:11,888 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:14:11,895 p=31411 u=zuul n=ansible | TASK [install_ca : Update ca bundle _raw_params=update-ca-trust] ***************
2026-01-22 16:14:11,895 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:11 +0000 (0:00:00.025) 0:00:01.999 ******
2026-01-22 16:14:11,895 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:11 +0000 (0:00:00.025) 0:00:01.997 ******
2026-01-22 16:14:13,284 p=31411 u=zuul n=ansible | changed: [localhost]
2026-01-22 16:14:13,295 p=31411 u=zuul n=ansible | TASK [repo_setup : Ensure directories are present path={{ cifmw_repo_setup_basedir }}/{{ item }}, state=directory, mode=0755] ***
2026-01-22 16:14:13,295 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:13 +0000 (0:00:01.400) 0:00:03.399 ******
2026-01-22 16:14:13,295 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:13 +0000 (0:00:01.400) 0:00:03.398 ******
2026-01-22 16:14:13,491 p=31411 u=zuul n=ansible | changed: [localhost] => (item=tmp)
2026-01-22 16:14:13,647 p=31411 u=zuul n=ansible | changed: [localhost] => (item=artifacts/repositories)
2026-01-22 16:14:13,810 p=31411 u=zuul n=ansible | changed: [localhost] => (item=venv/repo_setup)
2026-01-22 16:14:13,818 p=31411 u=zuul n=ansible | TASK [repo_setup : Make sure git-core package is installed name=git-core, state=present] ***
2026-01-22 16:14:13,819 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:13 +0000 (0:00:00.523) 0:00:03.922 ******
2026-01-22 16:14:13,819 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:13 +0000 (0:00:00.523) 0:00:03.921 ******
2026-01-22 16:14:14,860 p=31411 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:14:14,867 p=31411 u=zuul n=ansible | TASK [repo_setup : Get repo-setup repository accept_hostkey=True, dest={{ cifmw_repo_setup_basedir }}/tmp/repo-setup, repo={{ cifmw_repo_setup_src }}] ***
2026-01-22 16:14:14,868 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:14 +0000 (0:00:01.048) 0:00:04.971 ******
2026-01-22 16:14:14,868 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:14 +0000 (0:00:01.048) 0:00:04.970 ******
2026-01-22 16:14:16,141 p=31411 u=zuul n=ansible | changed: [localhost]
2026-01-22 16:14:16,146 p=31411 u=zuul n=ansible | TASK [repo_setup : Initialize python venv and install requirements virtualenv={{ cifmw_repo_setup_venv }}, requirements={{ cifmw_repo_setup_basedir }}/tmp/repo-setup/requirements.txt, virtualenv_command=python3 -m venv --system-site-packages --upgrade-deps] ***
2026-01-22 16:14:16,147 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:16 +0000 (0:00:01.279) 0:00:06.250 ******
2026-01-22 16:14:16,147 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:16 +0000 (0:00:01.279) 0:00:06.249 ******
2026-01-22 16:14:24,975 p=31411 u=zuul n=ansible | changed: [localhost]
2026-01-22 16:14:24,981 p=31411 u=zuul n=ansible | TASK [repo_setup : Install repo-setup package chdir={{ cifmw_repo_setup_basedir }}/tmp/repo-setup, creates={{ cifmw_repo_setup_venv }}/bin/repo-setup, _raw_params={{ cifmw_repo_setup_venv }}/bin/python setup.py install] ***
2026-01-22 16:14:24,981 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:24 +0000 (0:00:08.834) 0:00:15.085 ******
2026-01-22 16:14:24,981 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:24 +0000 (0:00:08.834) 0:00:15.084 ******
2026-01-22 16:14:25,736 p=31411 u=zuul n=ansible | changed: [localhost]
2026-01-22 16:14:25,744 p=31411 u=zuul n=ansible | TASK [repo_setup : Set cifmw_repo_setup_dlrn_hash_tag from content provider cifmw_repo_setup_dlrn_hash_tag={{ content_provider_dlrn_md5_hash }}] ***
2026-01-22 16:14:25,744 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:25 +0000 (0:00:00.762) 0:00:15.848 ******
2026-01-22 16:14:25,744 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:25 +0000 (0:00:00.762) 0:00:15.847 ******
2026-01-22 16:14:25,765 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:14:25,772 p=31411 u=zuul n=ansible | TASK [repo_setup : Run repo-setup _raw_params={{ cifmw_repo_setup_venv }}/bin/repo-setup {{ cifmw_repo_setup_promotion }} {{ cifmw_repo_setup_additional_repos }} -d {{ cifmw_repo_setup_os_release }}{{ cifmw_repo_setup_dist_major_version }} -b {{ cifmw_repo_setup_branch }} --rdo-mirror {{ cifmw_repo_setup_rdo_mirror }} {% if cifmw_repo_setup_dlrn_hash_tag | length > 0 %} --dlrn-hash-tag {{ cifmw_repo_setup_dlrn_hash_tag }} {% endif %} -o {{ cifmw_repo_setup_output }}] ***
2026-01-22 16:14:25,772 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:25 +0000 (0:00:00.027) 0:00:15.876 ******
2026-01-22 16:14:25,772 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:25 +0000 (0:00:00.027) 0:00:15.874 ******
2026-01-22 16:14:26,447 p=31411 u=zuul n=ansible | changed: [localhost]
2026-01-22 16:14:26,454 p=31411 u=zuul n=ansible | TASK [repo_setup : Get component repo url={{ cifmw_repo_setup_dlrn_uri }}/{{ cifmw_repo_setup_os_release }}{{ cifmw_repo_setup_dist_major_version }}-{{ cifmw_repo_setup_branch }}/component/{{ cifmw_repo_setup_component_name }}/{{ cifmw_repo_setup_component_promotion_tag }}/delorean.repo, dest={{ cifmw_repo_setup_output }}/{{ cifmw_repo_setup_component_name }}_{{ cifmw_repo_setup_component_promotion_tag }}_delorean.repo, mode=0644] ***
2026-01-22 16:14:26,455 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:26 +0000 (0:00:00.683) 0:00:16.559 ******
2026-01-22 16:14:26,455 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:26 +0000 (0:00:00.683) 0:00:16.557 ******
2026-01-22 16:14:26,484 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:14:26,490 p=31411 u=zuul n=ansible | TASK [repo_setup : Rename component repo path={{ cifmw_repo_setup_output }}/{{ cifmw_repo_setup_component_name }}_{{ cifmw_repo_setup_component_promotion_tag }}_delorean.repo, regexp=delorean-component-{{ cifmw_repo_setup_component_name }}, replace={{ cifmw_repo_setup_component_name }}-{{ cifmw_repo_setup_component_promotion_tag }}] ***
2026-01-22 16:14:26,490 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:26 +0000 (0:00:00.035) 0:00:16.594 ******
2026-01-22 16:14:26,490 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:26 +0000 (0:00:00.035) 0:00:16.593 ******
2026-01-22 16:14:26,519 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:14:26,526 p=31411 u=zuul n=ansible | TASK [repo_setup : Disable component repo in current-podified dlrn repo path={{ cifmw_repo_setup_output }}/delorean.repo, section=delorean-component-{{ cifmw_repo_setup_component_name }}, option=enabled, value=0, mode=0644] ***
2026-01-22 16:14:26,526 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:26 +0000 (0:00:00.035) 0:00:16.630 ******
2026-01-22 16:14:26,526 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:26 +0000 (0:00:00.035) 0:00:16.629 ******
2026-01-22 16:14:26,556 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:14:26,564 p=31411 u=zuul n=ansible | TASK [repo_setup : Run repo-setup-get-hash _raw_params={{ cifmw_repo_setup_venv }}/bin/repo-setup-get-hash --dlrn-url {{ cifmw_repo_setup_dlrn_uri[:-1] }} --os-version {{ cifmw_repo_setup_os_release }}{{ cifmw_repo_setup_dist_major_version }} --release {{ cifmw_repo_setup_branch }} {% if cifmw_repo_setup_component_name | length > 0 -%} --component {{ cifmw_repo_setup_component_name }} --tag {{ cifmw_repo_setup_component_promotion_tag }} {% else -%} --tag {{cifmw_repo_setup_promotion }} {% endif -%} {% if (cifmw_repo_setup_dlrn_hash_tag | length > 0) and (cifmw_repo_setup_component_name | length <= 0) -%} --dlrn-hash-tag {{ cifmw_repo_setup_dlrn_hash_tag }} {% endif -%} --json] ***
2026-01-22 16:14:26,564 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:26 +0000 (0:00:00.037) 0:00:16.668 ******
2026-01-22 16:14:26,564 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:26 +0000 (0:00:00.037) 0:00:16.666 ******
2026-01-22 16:14:27,031 p=31411 u=zuul n=ansible | changed: [localhost]
2026-01-22 16:14:27,038 p=31411 u=zuul n=ansible | TASK [repo_setup : Dump full hash in delorean.repo.md5 file content={{ _repo_setup_json['full_hash'] }} , dest={{ cifmw_repo_setup_basedir }}/artifacts/repositories/delorean.repo.md5, mode=0644] ***
2026-01-22 16:14:27,038 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:27 +0000 (0:00:00.473) 0:00:17.142 ******
2026-01-22 16:14:27,038 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:27 +0000 (0:00:00.473) 0:00:17.140 ******
2026-01-22 16:14:27,662 p=31411 u=zuul n=ansible | changed: [localhost]
2026-01-22 16:14:27,667 p=31411 u=zuul n=ansible | TASK [repo_setup : Dump current-podified hash url={{ cifmw_repo_setup_dlrn_uri }}/{{ cifmw_repo_setup_os_release }}{{ cifmw_repo_setup_dist_major_version }}-{{ cifmw_repo_setup_branch }}/current-podified/delorean.repo.md5, dest={{ cifmw_repo_setup_basedir }}/artifacts/repositories/delorean.repo.md5, mode=0644] ***
2026-01-22 16:14:27,668 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:27 +0000 (0:00:00.629) 0:00:17.771 ******
2026-01-22 16:14:27,668 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:27 +0000 (0:00:00.629) 0:00:17.770 ******
2026-01-22 16:14:27,680 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:14:27,686 p=31411 u=zuul n=ansible | TASK [repo_setup : Slurp current podified hash src={{ cifmw_repo_setup_basedir }}/artifacts/repositories/delorean.repo.md5] ***
2026-01-22 16:14:27,686 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:27 +0000 (0:00:00.018) 0:00:17.790 ******
2026-01-22 16:14:27,686 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:27 +0000 (0:00:00.018) 0:00:17.789 ******
2026-01-22 16:14:27,699 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:14:27,705 p=31411 u=zuul n=ansible | TASK [repo_setup : Update the value of full_hash _repo_setup_json={{ _repo_setup_json | combine({'full_hash': _hash}, recursive=true) }}] ***
2026-01-22 16:14:27,705 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:27 +0000 (0:00:00.018) 0:00:17.809 ******
2026-01-22 16:14:27,705 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:27 +0000 (0:00:00.018) 0:00:17.807 ******
2026-01-22 16:14:27,717 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:14:27,723 p=31411 u=zuul n=ansible | TASK [repo_setup : Export hashes facts for further use cifmw_repo_setup_full_hash={{ _repo_setup_json['full_hash'] }}, cifmw_repo_setup_commit_hash={{ _repo_setup_json['commit_hash'] }}, cifmw_repo_setup_distro_hash={{ _repo_setup_json['distro_hash'] }}, cifmw_repo_setup_extended_hash={{ _repo_setup_json['extended_hash'] }}, cifmw_repo_setup_dlrn_api_url={{ _repo_setup_json['dlrn_api_url'] }}, cifmw_repo_setup_dlrn_url={{ _repo_setup_json['dlrn_url'] }}, cifmw_repo_setup_release={{ _repo_setup_json['release'] }}, cacheable=True] ***
2026-01-22 16:14:27,723 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:27 +0000 (0:00:00.018) 0:00:17.827 ******
2026-01-22 16:14:27,723 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:27 +0000 (0:00:00.018) 0:00:17.825 ******
2026-01-22 16:14:27,744 p=31411 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:14:27,750 p=31411 u=zuul n=ansible | TASK [repo_setup : Create download directory path={{ cifmw_repo_setup_rhos_release_path }}, state=directory, mode=0755] ***
2026-01-22 16:14:27,750 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:27 +0000 (0:00:00.027) 0:00:17.854 ******
2026-01-22 16:14:27,750 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:27 +0000 (0:00:00.027) 0:00:17.852 ******
2026-01-22 16:14:27,761 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:14:27,768 p=31411 u=zuul n=ansible | TASK [repo_setup : Print the URL to request msg={{ cifmw_repo_setup_rhos_release_rpm }}] ***
2026-01-22 16:14:27,768 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:27 +0000 (0:00:00.017) 0:00:17.872 ******
2026-01-22 16:14:27,768 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:27 +0000 (0:00:00.017) 0:00:17.870 ******
2026-01-22 16:14:27,779 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:14:27,785 p=31411 u=zuul n=ansible | TASK [Download the RPM name=krb_request] ***************************************
2026-01-22 16:14:27,785 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:27 +0000 (0:00:00.017) 0:00:17.889 ******
2026-01-22 16:14:27,785 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:27 +0000 (0:00:00.017) 0:00:17.888 ******
2026-01-22 16:14:27,799 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:14:27,804 p=31411 u=zuul n=ansible | TASK [repo_setup : Install RHOS Release tool name={{ cifmw_repo_setup_rhos_release_rpm if cifmw_repo_setup_rhos_release_rpm is not url else cifmw_krb_request_out.path }}, state=present, disable_gpg_check={{ cifmw_repo_setup_rhos_release_gpg_check | bool }}] ***
2026-01-22 16:14:27,804 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:27 +0000 (0:00:00.019) 0:00:17.908 ******
2026-01-22 16:14:27,804 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:27 +0000 (0:00:00.019) 0:00:17.907 ******
2026-01-22 16:14:27,816 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:14:27,822 p=31411 u=zuul n=ansible | TASK [repo_setup : Get rhos-release tool version _raw_params=rhos-release --version] ***
2026-01-22 16:14:27,822 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:27 +0000 (0:00:00.017) 0:00:17.926 ******
2026-01-22 16:14:27,822 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:27 +0000 (0:00:00.017) 0:00:17.925 ******
2026-01-22 16:14:27,836 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:14:27,843 p=31411 u=zuul n=ansible | TASK [repo_setup : Print rhos-release tool version msg={{ rr_version.stdout }}] ***
2026-01-22 16:14:27,843 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:27 +0000 (0:00:00.020) 0:00:17.947 ******
2026-01-22 16:14:27,843 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:27 +0000 (0:00:00.020) 0:00:17.946 ******
2026-01-22 16:14:27,854 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:14:27,861 p=31411 u=zuul n=ansible | TASK [repo_setup : Generate repos using rhos-release {{ cifmw_repo_setup_rhos_release_args }} _raw_params=rhos-release {{ cifmw_repo_setup_rhos_release_args }} \ -t {{ cifmw_repo_setup_output }}] ***
2026-01-22 16:14:27,861 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:27 +0000 (0:00:00.017) 0:00:17.965 ******
2026-01-22 16:14:27,861 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:27 +0000 (0:00:00.017) 0:00:17.963 ******
2026-01-22 16:14:27,871 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:14:27,877 p=31411 u=zuul n=ansible | TASK [repo_setup : Check for /etc/ci/mirror_info.sh path=/etc/ci/mirror_info.sh] ***
2026-01-22 16:14:27,877 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:27 +0000 (0:00:00.016) 0:00:17.981 ******
2026-01-22 16:14:27,877 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:27 +0000 (0:00:00.016) 0:00:17.980 ******
2026-01-22 16:14:28,050 p=31411 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:14:28,061 p=31411 u=zuul n=ansible | TASK [repo_setup : Use RDO proxy mirrors chdir={{ cifmw_repo_setup_output }}, _raw_params=set -o pipefail source /etc/ci/mirror_info.sh sed -i -e "s|https://trunk.rdoproject.org|$NODEPOOL_RDO_PROXY|g" *.repo ] ***
2026-01-22 16:14:28,061 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:28 +0000 (0:00:00.183) 0:00:18.165 ******
2026-01-22 16:14:28,061 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:28 +0000 (0:00:00.183) 0:00:18.163 ******
2026-01-22 16:14:28,277 p=31411 u=zuul n=ansible | changed: [localhost]
2026-01-22 16:14:28,282 p=31411 u=zuul n=ansible | TASK [repo_setup : Use RDO CentOS mirrors (remove CentOS 10 conditional when Nodepool mirrors exist) chdir={{ cifmw_repo_setup_output }}, _raw_params=set -o pipefail source /etc/ci/mirror_info.sh sed -i -e "s|http://mirror.stream.centos.org|$NODEPOOL_CENTOS_MIRROR|g" *.repo ] ***
2026-01-22 16:14:28,283 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:28 +0000 (0:00:00.221) 0:00:18.387 ******
2026-01-22 16:14:28,283 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:28 +0000 (0:00:00.221) 0:00:18.385 ******
2026-01-22 16:14:28,473 p=31411 u=zuul n=ansible | changed: [localhost]
2026-01-22 16:14:28,479 p=31411 u=zuul n=ansible | TASK [repo_setup : Check for gating.repo file on content provider url=http://{{ content_provider_registry_ip }}:8766/gating.repo] ***
2026-01-22 16:14:28,479 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:28 +0000 (0:00:00.196) 0:00:18.583 ******
2026-01-22 16:14:28,479 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:28 +0000 (0:00:00.196) 0:00:18.582 ******
2026-01-22 16:14:29,023 p=31411 u=zuul n=ansible | fatal: [localhost]: FAILED! =>
  changed: false
  elapsed: 0
  msg: 'Status code was -1 and not [200]: Request failed: '
  redirected: false
  status: -1
  url: http://38.102.83.113:8766/gating.repo
2026-01-22 16:14:29,023 p=31411 u=zuul n=ansible | ...ignoring
2026-01-22 16:14:29,031 p=31411 u=zuul n=ansible | TASK [repo_setup : Populate gating repo from content provider ip content=[gating-repo] baseurl=http://{{ content_provider_registry_ip }}:8766/ enabled=1 gpgcheck=0 priority=1 , dest={{ cifmw_repo_setup_output }}/gating.repo, mode=0644] ***
2026-01-22 16:14:29,031 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:29 +0000 (0:00:00.551) 0:00:19.135 ******
2026-01-22 16:14:29,031 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:29 +0000 (0:00:00.551) 0:00:19.133 ******
2026-01-22 16:14:29,053 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:14:29,060 p=31411 u=zuul n=ansible | TASK [repo_setup : Check for DLRN repo at the destination path={{ cifmw_repo_setup_output }}/delorean.repo] ***
2026-01-22 16:14:29,060 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:29 +0000 (0:00:00.029) 0:00:19.164 ******
2026-01-22 16:14:29,060 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:29 +0000 (0:00:00.029) 0:00:19.163 ******
2026-01-22 16:14:29,083 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:14:29,090 p=31411 u=zuul n=ansible | TASK [repo_setup : Lower the priority of DLRN repos to allow installation from gating repo path={{ cifmw_repo_setup_output }}/delorean.repo, regexp=priority=1, replace=priority=20] ***
2026-01-22 16:14:29,090 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:29 +0000 (0:00:00.030) 0:00:19.194 ******
2026-01-22 16:14:29,090 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:29 +0000 (0:00:00.030) 0:00:19.193 ******
2026-01-22 16:14:29,113 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:14:29,119 p=31411 u=zuul n=ansible | TASK [repo_setup : Check for DLRN component repo path={{ cifmw_repo_setup_output }}/{{ _comp_repo }}] ***
2026-01-22 16:14:29,119 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:29 +0000 (0:00:00.029) 0:00:19.223 ******
2026-01-22 16:14:29,119 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:29 +0000 (0:00:00.029) 0:00:19.222 ******
2026-01-22 16:14:29,141 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:14:29,148 p=31411 u=zuul n=ansible | TASK [repo_setup : Lower the priority of componennt repos to allow installation from gating repo path={{ cifmw_repo_setup_output }}//{{ _comp_repo }}, regexp=priority=1, replace=priority=2] ***
2026-01-22 16:14:29,148 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:29 +0000 (0:00:00.028) 0:00:19.252 ******
2026-01-22 16:14:29,148 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:29 +0000 (0:00:00.028) 0:00:19.250 ******
2026-01-22 16:14:29,170 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:14:29,176 p=31411 u=zuul n=ansible | TASK [repo_setup : Find existing repos from /etc/yum.repos.d directory paths=/etc/yum.repos.d/, patterns=*.repo, recurse=False] ***
2026-01-22 16:14:29,176 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:29 +0000 (0:00:00.028) 0:00:19.280 ******
2026-01-22 16:14:29,176 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:29 +0000 (0:00:00.028) 0:00:19.279 ******
2026-01-22 16:14:29,473 p=31411 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:14:29,481 p=31411 u=zuul n=ansible | TASK [repo_setup : Remove existing repos from /etc/yum.repos.d directory path={{ item }}, state=absent] ***
2026-01-22 16:14:29,481 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:29 +0000 (0:00:00.304) 0:00:19.585 ******
2026-01-22 16:14:29,481 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:29 +0000 (0:00:00.304) 0:00:19.583 ******
2026-01-22 16:14:29,690 p=31411 u=zuul n=ansible | changed: [localhost] => (item=/etc/yum.repos.d/centos-addons.repo)
2026-01-22 16:14:29,865 p=31411 u=zuul n=ansible | changed: [localhost] => (item=/etc/yum.repos.d/centos.repo)
2026-01-22 16:14:29,876 p=31411 u=zuul n=ansible | TASK [repo_setup : Cleanup existing metadata _raw_params=dnf clean metadata] ***
2026-01-22 16:14:29,876 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:29 +0000 (0:00:00.395) 0:00:19.980 ******
2026-01-22 16:14:29,876 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:29 +0000 (0:00:00.395) 0:00:19.979 ******
2026-01-22 16:14:30,312 p=31411 u=zuul n=ansible | changed: [localhost]
2026-01-22 16:14:30,318 p=31411 u=zuul n=ansible | TASK [repo_setup : Copy generated repos to /etc/yum.repos.d directory mode=0755, remote_src=True, src={{ cifmw_repo_setup_output }}/, dest=/etc/yum.repos.d] ***
2026-01-22 16:14:30,318 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:30 +0000 (0:00:00.441) 0:00:20.422 ******
2026-01-22 16:14:30,318 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:30 +0000 (0:00:00.441) 0:00:20.421 ******
2026-01-22 16:14:30,551 p=31411 u=zuul n=ansible | changed: [localhost]
2026-01-22 16:14:30,561 p=31411 u=zuul n=ansible | TASK [ci_setup : Gather variables for each operating system _raw_params={{ item }}] ***
2026-01-22 16:14:30,561 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:30 +0000 (0:00:00.242) 0:00:20.665 ******
2026-01-22 16:14:30,561 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:30 +0000 (0:00:00.242) 0:00:20.664 ******
2026-01-22 16:14:30,591 p=31411 u=zuul n=ansible | ok: [localhost] => (item=/home/zuul/src/github.com/openstack-k8s-operators/ci-framework/roles/ci_setup/vars/redhat.yml)
2026-01-22 16:14:30,598 p=31411 u=zuul n=ansible | TASK [ci_setup : List packages to install var=cifmw_ci_setup_packages] *********
2026-01-22 16:14:30,598 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:30 +0000 (0:00:00.037) 0:00:20.702 ******
2026-01-22 16:14:30,598 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:30 +0000 (0:00:00.037) 0:00:20.701 ******
2026-01-22 16:14:30,613 p=31411 u=zuul n=ansible | ok: [localhost] => cifmw_ci_setup_packages: - bash-completion - ca-certificates - git-core - make - tar - tmux - python3-pip 2026-01-22 16:14:30,619 p=31411 u=zuul n=ansible | TASK [ci_setup : Install needed packages name={{ cifmw_ci_setup_packages }}, state=latest] *** 2026-01-22 16:14:30,619 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:30 +0000 (0:00:00.020) 0:00:20.723 ****** 2026-01-22 16:14:30,619 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:30 +0000 (0:00:00.020) 0:00:20.721 ****** 2026-01-22 16:14:58,837 p=31411 u=zuul n=ansible | changed: [localhost] 2026-01-22 16:14:58,844 p=31411 u=zuul n=ansible | TASK [ci_setup : Gather version of openshift client _raw_params=oc version --client -o yaml] *** 2026-01-22 16:14:58,844 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:58 +0000 (0:00:28.225) 0:00:48.948 ****** 2026-01-22 16:14:58,844 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:58 +0000 (0:00:28.225) 0:00:48.947 ****** 2026-01-22 16:14:59,030 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:14:59,036 p=31411 u=zuul n=ansible | TASK [ci_setup : Ensure openshift client install path is present path={{ cifmw_ci_setup_oc_install_path }}, state=directory, mode=0755] *** 2026-01-22 16:14:59,036 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:59 +0000 (0:00:00.191) 0:00:49.140 ****** 2026-01-22 16:14:59,036 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:59 +0000 (0:00:00.191) 0:00:49.139 ****** 2026-01-22 16:14:59,247 p=31411 u=zuul n=ansible | changed: [localhost] 2026-01-22 16:14:59,257 p=31411 u=zuul n=ansible | TASK [ci_setup : Install openshift client src={{ cifmw_ci_setup_openshift_client_download_uri }}/{{ cifmw_ci_setup_openshift_client_version }}/openshift-client-linux.tar.gz, dest={{ cifmw_ci_setup_oc_install_path }}, remote_src=True, mode=0755, creates={{ cifmw_ci_setup_oc_install_path }}/oc] *** 2026-01-22 16:14:59,257 p=31411 
u=zuul n=ansible | Thursday 22 January 2026 16:14:59 +0000 (0:00:00.220) 0:00:49.361 ****** 2026-01-22 16:14:59,257 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:14:59 +0000 (0:00:00.220) 0:00:49.359 ****** 2026-01-22 16:15:04,508 p=31411 u=zuul n=ansible | changed: [localhost] 2026-01-22 16:15:04,521 p=31411 u=zuul n=ansible | TASK [ci_setup : Add the OC path to cifmw_path if needed cifmw_path={{ cifmw_ci_setup_oc_install_path }}:{{ ansible_env.PATH }}, cacheable=True] *** 2026-01-22 16:15:04,521 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:04 +0000 (0:00:05.264) 0:00:54.625 ****** 2026-01-22 16:15:04,521 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:04 +0000 (0:00:05.264) 0:00:54.623 ****** 2026-01-22 16:15:04,545 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:15:04,558 p=31411 u=zuul n=ansible | TASK [ci_setup : Create completion file] *************************************** 2026-01-22 16:15:04,558 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:04 +0000 (0:00:00.036) 0:00:54.662 ****** 2026-01-22 16:15:04,558 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:04 +0000 (0:00:00.036) 0:00:54.660 ****** 2026-01-22 16:15:04,860 p=31411 u=zuul n=ansible | changed: [localhost] 2026-01-22 16:15:04,868 p=31411 u=zuul n=ansible | TASK [ci_setup : Source completion from within .bashrc create=True, mode=0644, path={{ ansible_user_dir }}/.bashrc, block=if [ -f ~/.oc_completion ]; then source ~/.oc_completion fi] *** 2026-01-22 16:15:04,868 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:04 +0000 (0:00:00.310) 0:00:54.972 ****** 2026-01-22 16:15:04,868 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:04 +0000 (0:00:00.310) 0:00:54.971 ****** 2026-01-22 16:15:05,173 p=31411 u=zuul n=ansible | changed: [localhost] 2026-01-22 16:15:05,179 p=31411 u=zuul n=ansible | TASK [ci_setup : Check rhsm status _raw_params=subscription-manager status] **** 2026-01-22 16:15:05,179 p=31411 u=zuul n=ansible 
| Thursday 22 January 2026 16:15:05 +0000 (0:00:00.311) 0:00:55.283 ****** 2026-01-22 16:15:05,179 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:05 +0000 (0:00:00.311) 0:00:55.282 ****** 2026-01-22 16:15:05,194 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:15:05,201 p=31411 u=zuul n=ansible | TASK [ci_setup : Gather the repos to be enabled _repos={{ cifmw_ci_setup_rhel_rhsm_default_repos + (cifmw_ci_setup_rhel_rhsm_extra_repos | default([])) }}] *** 2026-01-22 16:15:05,201 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:05 +0000 (0:00:00.022) 0:00:55.305 ****** 2026-01-22 16:15:05,201 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:05 +0000 (0:00:00.022) 0:00:55.304 ****** 2026-01-22 16:15:05,214 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:15:05,221 p=31411 u=zuul n=ansible | TASK [ci_setup : Enabling the required repositories. name={{ item }}, state={{ rhsm_repo_state | default('enabled') }}] *** 2026-01-22 16:15:05,221 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:05 +0000 (0:00:00.019) 0:00:55.325 ****** 2026-01-22 16:15:05,221 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:05 +0000 (0:00:00.019) 0:00:55.324 ****** 2026-01-22 16:15:05,236 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:15:05,243 p=31411 u=zuul n=ansible | TASK [ci_setup : Get current /etc/redhat-release _raw_params=cat /etc/redhat-release] *** 2026-01-22 16:15:05,244 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:05 +0000 (0:00:00.022) 0:00:55.347 ****** 2026-01-22 16:15:05,244 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:05 +0000 (0:00:00.022) 0:00:55.346 ****** 2026-01-22 16:15:05,258 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:15:05,266 p=31411 u=zuul n=ansible | TASK [ci_setup : Print current /etc/redhat-release msg={{ _current_rh_release.stdout }}] *** 2026-01-22 16:15:05,266 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:05 +0000 
(0:00:00.022) 0:00:55.370 ****** 2026-01-22 16:15:05,266 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:05 +0000 (0:00:00.022) 0:00:55.368 ****** 2026-01-22 16:15:05,280 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:15:05,288 p=31411 u=zuul n=ansible | TASK [ci_setup : Ensure the repos are enabled in the system using yum name={{ item.name }}, baseurl={{ item.baseurl }}, description={{ item.description | default(item.name) }}, gpgcheck={{ item.gpgcheck | default(false) }}, enabled=True, state={{ yum_repo_state | default('present') }}] *** 2026-01-22 16:15:05,288 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:05 +0000 (0:00:00.022) 0:00:55.392 ****** 2026-01-22 16:15:05,288 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:05 +0000 (0:00:00.022) 0:00:55.391 ****** 2026-01-22 16:15:05,307 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:15:05,314 p=31411 u=zuul n=ansible | TASK [ci_setup : Manage directories path={{ item }}, state={{ directory_state }}, mode=0755, owner={{ ansible_user_id }}, group={{ ansible_user_id }}] *** 2026-01-22 16:15:05,314 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:05 +0000 (0:00:00.025) 0:00:55.418 ****** 2026-01-22 16:15:05,314 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:05 +0000 (0:00:00.025) 0:00:55.416 ****** 2026-01-22 16:15:05,530 p=31411 u=zuul n=ansible | changed: [localhost] => (item=/home/zuul/ci-framework-data/artifacts/manifests/openstack/cr) 2026-01-22 16:15:05,750 p=31411 u=zuul n=ansible | changed: [localhost] => (item=/home/zuul/ci-framework-data/logs) 2026-01-22 16:15:05,903 p=31411 u=zuul n=ansible | ok: [localhost] => (item=/home/zuul/ci-framework-data/tmp) 2026-01-22 16:15:06,184 p=31411 u=zuul n=ansible | changed: [localhost] => (item=/home/zuul/ci-framework-data/volumes) 2026-01-22 16:15:06,373 p=31411 u=zuul n=ansible | ok: [localhost] => (item=/home/zuul/ci-framework-data/artifacts/parameters) 2026-01-22 16:15:06,385 p=31411 
u=zuul n=ansible | TASK [Prepare install_yamls make targets name=install_yamls, apply={'tags': ['bootstrap']}] *** 2026-01-22 16:15:06,385 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:06 +0000 (0:00:01.071) 0:00:56.489 ****** 2026-01-22 16:15:06,385 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:06 +0000 (0:00:01.071) 0:00:56.488 ****** 2026-01-22 16:15:06,501 p=31411 u=zuul n=ansible | TASK [install_yamls : Ensure directories exist path={{ item }}, state=directory, mode=0755] *** 2026-01-22 16:15:06,501 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:06 +0000 (0:00:00.115) 0:00:56.605 ****** 2026-01-22 16:15:06,501 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:06 +0000 (0:00:00.115) 0:00:56.603 ****** 2026-01-22 16:15:06,673 p=31411 u=zuul n=ansible | ok: [localhost] => (item=/home/zuul/ci-framework-data/artifacts) 2026-01-22 16:15:06,823 p=31411 u=zuul n=ansible | changed: [localhost] => (item=/home/zuul/ci-framework-data/artifacts/roles/install_yamls_makes/tasks) 2026-01-22 16:15:07,013 p=31411 u=zuul n=ansible | ok: [localhost] => (item=/home/zuul/ci-framework-data/artifacts/parameters) 2026-01-22 16:15:07,020 p=31411 u=zuul n=ansible | TASK [Create variables with local repos based on Zuul items name=install_yamls, tasks_from=zuul_set_operators_repo.yml] *** 2026-01-22 16:15:07,020 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:07 +0000 (0:00:00.519) 0:00:57.124 ****** 2026-01-22 16:15:07,020 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:07 +0000 (0:00:00.519) 0:00:57.123 ****** 2026-01-22 16:15:07,050 p=31411 u=zuul n=ansible | TASK [install_yamls : Set fact with local repos based on Zuul items cifmw_install_yamls_operators_repo={{ cifmw_install_yamls_operators_repo | default({}) | combine(_repo_operator_info | items2dict) }}] *** 2026-01-22 16:15:07,050 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:07 +0000 (0:00:00.030) 0:00:57.154 ****** 2026-01-22 16:15:07,051 p=31411 u=zuul 
n=ansible | Thursday 22 January 2026 16:15:07 +0000 (0:00:00.030) 0:00:57.153 ****** 2026-01-22 16:15:07,091 p=31411 u=zuul n=ansible | ok: [localhost] => (item={'branch': 'main', 'change': '577', 'change_url': 'https://github.com/openstack-k8s-operators/neutron-operator/pull/577', 'commit_id': 'ee35d45e2b7d36f4be2f39423542f378a292b175', 'patchset': 'ee35d45e2b7d36f4be2f39423542f378a292b175', 'project': {'canonical_hostname': 'github.com', 'canonical_name': 'github.com/openstack-k8s-operators/neutron-operator', 'name': 'openstack-k8s-operators/neutron-operator', 'short_name': 'neutron-operator', 'src_dir': 'src/github.com/openstack-k8s-operators/neutron-operator'}, 'topic': None}) 2026-01-22 16:15:07,098 p=31411 u=zuul n=ansible | TASK [install_yamls : Print helpful data for debugging msg=_repo_operator_name: {{ _repo_operator_name }} _repo_operator_info: {{ _repo_operator_info }} cifmw_install_yamls_operators_repo: {{ cifmw_install_yamls_operators_repo }} ] *** 2026-01-22 16:15:07,099 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:07 +0000 (0:00:00.048) 0:00:57.202 ****** 2026-01-22 16:15:07,099 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:07 +0000 (0:00:00.048) 0:00:57.201 ****** 2026-01-22 16:15:07,139 p=31411 u=zuul n=ansible | ok: [localhost] => (item={'branch': 'main', 'change': '577', 'change_url': 'https://github.com/openstack-k8s-operators/neutron-operator/pull/577', 'commit_id': 'ee35d45e2b7d36f4be2f39423542f378a292b175', 'patchset': 'ee35d45e2b7d36f4be2f39423542f378a292b175', 'project': {'canonical_hostname': 'github.com', 'canonical_name': 'github.com/openstack-k8s-operators/neutron-operator', 'name': 'openstack-k8s-operators/neutron-operator', 'short_name': 'neutron-operator', 'src_dir': 'src/github.com/openstack-k8s-operators/neutron-operator'}, 'topic': None}) => msg: | _repo_operator_name: neutron _repo_operator_info: [{'key': 'NEUTRON_REPO', 'value': '/home/zuul/src/github.com/openstack-k8s-operators/neutron-operator'}, 
{'key': 'NEUTRON_BRANCH', 'value': ''}] cifmw_install_yamls_operators_repo: {'NEUTRON_REPO': '/home/zuul/src/github.com/openstack-k8s-operators/neutron-operator', 'NEUTRON_BRANCH': ''} 2026-01-22 16:15:07,151 p=31411 u=zuul n=ansible | TASK [Customize install_yamls devsetup vars if needed name=install_yamls, tasks_from=customize_devsetup_vars.yml] *** 2026-01-22 16:15:07,151 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:07 +0000 (0:00:00.052) 0:00:57.255 ****** 2026-01-22 16:15:07,151 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:07 +0000 (0:00:00.052) 0:00:57.254 ****** 2026-01-22 16:15:07,191 p=31411 u=zuul n=ansible | TASK [install_yamls : Update opm_version in install_yamls devsetup/vars/default.yaml path={{ cifmw_install_yamls_repo }}/devsetup/vars/default.yaml, regexp=^opm_version:, line=opm_version: {{ cifmw_install_yamls_opm_version }}, state=present] *** 2026-01-22 16:15:07,191 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:07 +0000 (0:00:00.040) 0:00:57.295 ****** 2026-01-22 16:15:07,191 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:07 +0000 (0:00:00.040) 0:00:57.294 ****** 2026-01-22 16:15:07,217 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:15:07,225 p=31411 u=zuul n=ansible | TASK [install_yamls : Update sdk_version in install_yamls devsetup/vars/default.yaml path={{ cifmw_install_yamls_repo }}/devsetup/vars/default.yaml, regexp=^sdk_version:, line=sdk_version: {{ cifmw_install_yamls_sdk_version }}, state=present] *** 2026-01-22 16:15:07,226 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:07 +0000 (0:00:00.034) 0:00:57.330 ****** 2026-01-22 16:15:07,226 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:07 +0000 (0:00:00.034) 0:00:57.328 ****** 2026-01-22 16:15:07,248 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:15:07,254 p=31411 u=zuul n=ansible | TASK [install_yamls : Update go_version in install_yamls devsetup/vars/default.yaml path={{ 
cifmw_install_yamls_repo }}/devsetup/vars/default.yaml, regexp=^go_version:, line=go_version: {{ cifmw_install_yamls_go_version }}, state=present] *** 2026-01-22 16:15:07,254 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:07 +0000 (0:00:00.027) 0:00:57.358 ****** 2026-01-22 16:15:07,254 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:07 +0000 (0:00:00.027) 0:00:57.356 ****** 2026-01-22 16:15:07,275 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:15:07,281 p=31411 u=zuul n=ansible | TASK [install_yamls : Update kustomize_version in install_yamls devsetup/vars/default.yaml path={{ cifmw_install_yamls_repo }}/devsetup/vars/default.yaml, regexp=^kustomize_version:, line=kustomize_version: {{ cifmw_install_yamls_kustomize_version }}, state=present] *** 2026-01-22 16:15:07,281 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:07 +0000 (0:00:00.027) 0:00:57.385 ****** 2026-01-22 16:15:07,281 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:07 +0000 (0:00:00.027) 0:00:57.384 ****** 2026-01-22 16:15:07,305 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:15:07,316 p=31411 u=zuul n=ansible | TASK [install_yamls : Compute the cifmw_install_yamls_vars final value _install_yamls_override_vars={{ _install_yamls_override_vars | default({}) | combine(item, recursive=True) }}] *** 2026-01-22 16:15:07,316 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:07 +0000 (0:00:00.034) 0:00:57.420 ****** 2026-01-22 16:15:07,316 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:07 +0000 (0:00:00.034) 0:00:57.418 ****** 2026-01-22 16:15:07,375 p=31411 u=zuul n=ansible | ok: [localhost] => (item={'BMO_SETUP': False, 'INSTALL_CERT_MANAGER': False}) 2026-01-22 16:15:07,382 p=31411 u=zuul n=ansible | TASK [install_yamls : Set environment override cifmw_install_yamls_environment fact cifmw_install_yamls_environment={{ _install_yamls_override_vars.keys() | map('upper') | zip(_install_yamls_override_vars.values()) | 
items2dict(key_name=0, value_name=1) | combine({ 'OUT': cifmw_install_yamls_manifests_dir, 'OUTPUT_DIR': cifmw_install_yamls_edpm_dir, 'CHECKOUT_FROM_OPENSTACK_REF': cifmw_install_yamls_checkout_openstack_ref, 'OPENSTACK_K8S_BRANCH': (zuul is defined and not zuul.branch |regex_search('master|antelope|rhos')) | ternary(zuul.branch, 'main') }) | combine(install_yamls_operators_repos) }}, cacheable=True] *** 2026-01-22 16:15:07,383 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:07 +0000 (0:00:00.066) 0:00:57.486 ****** 2026-01-22 16:15:07,383 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:07 +0000 (0:00:00.066) 0:00:57.485 ****** 2026-01-22 16:15:07,417 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:15:07,423 p=31411 u=zuul n=ansible | TASK [install_yamls : Get environment structure base_path={{ cifmw_install_yamls_repo }}] *** 2026-01-22 16:15:07,423 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:07 +0000 (0:00:00.040) 0:00:57.527 ****** 2026-01-22 16:15:07,423 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:07 +0000 (0:00:00.040) 0:00:57.526 ****** 2026-01-22 16:15:07,950 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:15:07,958 p=31411 u=zuul n=ansible | TASK [install_yamls : Ensure Output directory exists path={{ cifmw_install_yamls_out_dir }}, state=directory, mode=0755] *** 2026-01-22 16:15:07,958 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:07 +0000 (0:00:00.534) 0:00:58.062 ****** 2026-01-22 16:15:07,958 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:07 +0000 (0:00:00.534) 0:00:58.061 ****** 2026-01-22 16:15:08,148 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:15:08,162 p=31411 u=zuul n=ansible | TASK [install_yamls : Ensure user cifmw_install_yamls_vars contains existing Makefile variables that=_cifmw_install_yamls_unmatched_vars | length == 0, msg=cifmw_install_yamls_vars contains a variable that is not defined in install_yamls Makefile nor 
cifmw_install_yamls_whitelisted_vars: {{ _cifmw_install_yamls_unmatched_vars | join(', ')}}, quiet=True] *** 2026-01-22 16:15:08,162 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:08 +0000 (0:00:00.204) 0:00:58.266 ****** 2026-01-22 16:15:08,162 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:08 +0000 (0:00:00.204) 0:00:58.265 ****** 2026-01-22 16:15:08,196 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:15:08,218 p=31411 u=zuul n=ansible | TASK [install_yamls : Generate /home/zuul/ci-framework-data/artifacts/install_yamls.sh dest={{ cifmw_install_yamls_out_dir }}/{{ cifmw_install_yamls_envfile }}, content={% for k,v in cifmw_install_yamls_environment.items() %} export {{ k }}={{ v }} {% endfor %}, mode=0644] *** 2026-01-22 16:15:08,218 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:08 +0000 (0:00:00.055) 0:00:58.322 ****** 2026-01-22 16:15:08,218 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:08 +0000 (0:00:00.055) 0:00:58.321 ****** 2026-01-22 16:15:08,648 p=31411 u=zuul n=ansible | changed: [localhost] 2026-01-22 16:15:08,662 p=31411 u=zuul n=ansible | TASK [install_yamls : Set install_yamls default values cifmw_install_yamls_defaults={{ get_makefiles_env_output.makefiles_values | combine(cifmw_install_yamls_environment) }}, cacheable=True] *** 2026-01-22 16:15:08,662 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:08 +0000 (0:00:00.444) 0:00:58.766 ****** 2026-01-22 16:15:08,662 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:08 +0000 (0:00:00.444) 0:00:58.765 ****** 2026-01-22 16:15:08,698 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:15:08,709 p=31411 u=zuul n=ansible | TASK [install_yamls : Show the env structure var=cifmw_install_yamls_environment] *** 2026-01-22 16:15:08,709 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:08 +0000 (0:00:00.046) 0:00:58.813 ****** 2026-01-22 16:15:08,709 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:08 +0000 
(0:00:00.046) 0:00:58.812 ****** 2026-01-22 16:15:08,738 p=31411 u=zuul n=ansible | ok: [localhost] => cifmw_install_yamls_environment: BMO_SETUP: false CHECKOUT_FROM_OPENSTACK_REF: 'true' INSTALL_CERT_MANAGER: false NEUTRON_BRANCH: '' NEUTRON_REPO: /home/zuul/src/github.com/openstack-k8s-operators/neutron-operator OPENSTACK_K8S_BRANCH: main OUT: /home/zuul/ci-framework-data/artifacts/manifests OUTPUT_DIR: /home/zuul/ci-framework-data/artifacts/edpm 2026-01-22 16:15:08,746 p=31411 u=zuul n=ansible | TASK [install_yamls : Show the env structure defaults var=cifmw_install_yamls_defaults] *** 2026-01-22 16:15:08,746 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:08 +0000 (0:00:00.037) 0:00:58.850 ****** 2026-01-22 16:15:08,746 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:08 +0000 (0:00:00.037) 0:00:58.849 ****** 2026-01-22 16:15:08,777 p=31411 u=zuul n=ansible | ok: [localhost] => cifmw_install_yamls_defaults: ADOPTED_EXTERNAL_NETWORK: 172.21.1.0/24 ADOPTED_INTERNALAPI_NETWORK: 172.17.1.0/24 ADOPTED_STORAGEMGMT_NETWORK: 172.20.1.0/24 ADOPTED_STORAGE_NETWORK: 172.18.1.0/24 ADOPTED_TENANT_NETWORK: 172.9.1.0/24 ANSIBLEEE: config/samples/_v1beta1_ansibleee.yaml ANSIBLEEE_BRANCH: main ANSIBLEEE_CR: /home/zuul/ci-framework-data/artifacts/manifests/operator/openstack-ansibleee-operator/config/samples/_v1beta1_ansibleee.yaml ANSIBLEEE_IMG: quay.io/openstack-k8s-operators/openstack-ansibleee-operator-index:latest ANSIBLEEE_KUTTL_CONF: /home/zuul/ci-framework-data/artifacts/manifests/operator/openstack-ansibleee-operator/kuttl-test.yaml ANSIBLEEE_KUTTL_DIR: /home/zuul/ci-framework-data/artifacts/manifests/operator/openstack-ansibleee-operator/test/kuttl/tests ANSIBLEEE_KUTTL_NAMESPACE: ansibleee-kuttl-tests ANSIBLEEE_REPO: https://github.com/openstack-k8s-operators/openstack-ansibleee-operator ANSIBLEE_COMMIT_HASH: '' BARBICAN: config/samples/barbican_v1beta1_barbican.yaml BARBICAN_BRANCH: main BARBICAN_COMMIT_HASH: '' BARBICAN_CR: 
/home/zuul/ci-framework-data/artifacts/manifests/operator/barbican-operator/config/samples/barbican_v1beta1_barbican.yaml BARBICAN_DEPL_IMG: unused BARBICAN_IMG: quay.io/openstack-k8s-operators/barbican-operator-index:latest BARBICAN_KUTTL_CONF: /home/zuul/ci-framework-data/artifacts/manifests/operator/barbican-operator/kuttl-test.yaml BARBICAN_KUTTL_DIR: /home/zuul/ci-framework-data/artifacts/manifests/operator/barbican-operator/test/kuttl/tests BARBICAN_KUTTL_NAMESPACE: barbican-kuttl-tests BARBICAN_REPO: https://github.com/openstack-k8s-operators/barbican-operator.git BARBICAN_SERVICE_ENABLED: 'true' BARBICAN_SIMPLE_CRYPTO_ENCRYPTION_KEY: sE**********U= BAREMETAL_BRANCH: main BAREMETAL_COMMIT_HASH: '' BAREMETAL_IMG: quay.io/openstack-k8s-operators/openstack-baremetal-operator-index:latest BAREMETAL_OS_CONTAINER_IMG: '' BAREMETAL_OS_IMG: '' BAREMETAL_OS_IMG_TYPE: '' BAREMETAL_REPO: https://github.com/openstack-k8s-operators/openstack-baremetal-operator.git BAREMETAL_TIMEOUT: 20m BASH_IMG: quay.io/openstack-k8s-operators/bash:latest BGP_ASN: '64999' BGP_LEAF_1: 100.65.4.1 BGP_LEAF_2: 100.64.4.1 BGP_OVN_ROUTING: 'false' BGP_PEER_ASN: '64999' BGP_SOURCE_IP: 172.30.4.2 BGP_SOURCE_IP6: f00d:f00d:f00d:f00d:f00d:f00d:f00d:42 BMAAS_BRIDGE_IPV4_PREFIX: 172.20.1.2/24 BMAAS_BRIDGE_IPV6_PREFIX: fd00:bbbb::2/64 BMAAS_INSTANCE_DISK_SIZE: '20' BMAAS_INSTANCE_MEMORY: '4096' BMAAS_INSTANCE_NAME_PREFIX: crc-bmaas BMAAS_INSTANCE_NET_MODEL: virtio BMAAS_INSTANCE_OS_VARIANT: centos-stream9 BMAAS_INSTANCE_VCPUS: '2' BMAAS_INSTANCE_VIRT_TYPE: kvm BMAAS_IPV4: 'true' BMAAS_IPV6: 'false' BMAAS_LIBVIRT_USER: sushyemu BMAAS_METALLB_ADDRESS_POOL: 172.20.1.64/26 BMAAS_METALLB_POOL_NAME: baremetal BMAAS_NETWORK_IPV4_PREFIX: 172.20.1.1/24 BMAAS_NETWORK_IPV6_PREFIX: fd00:bbbb::1/64 BMAAS_NETWORK_NAME: crc-bmaas BMAAS_NODE_COUNT: '1' BMAAS_OCP_INSTANCE_NAME: crc BMAAS_REDFISH_PASSWORD: password BMAAS_REDFISH_USERNAME: admin BMAAS_ROUTE_LIBVIRT_NETWORKS: crc-bmaas,crc,default 
BMAAS_SUSHY_EMULATOR_DRIVER: libvirt BMAAS_SUSHY_EMULATOR_IMAGE: quay.io/metal3-io/sushy-tools:latest BMAAS_SUSHY_EMULATOR_NAMESPACE: sushy-emulator BMAAS_SUSHY_EMULATOR_OS_CLIENT_CONFIG_FILE: /etc/openstack/clouds.yaml BMAAS_SUSHY_EMULATOR_OS_CLOUD: openstack BMH_NAMESPACE: openstack BMO_BRANCH: release-0.9 BMO_CLEANUP: 'true' BMO_COMMIT_HASH: '' BMO_IPA_BRANCH: stable/2024.1 BMO_IRONIC_HOST: 192.168.122.10 BMO_PROVISIONING_INTERFACE: '' BMO_REPO: https://github.com/metal3-io/baremetal-operator BMO_SETUP: false BMO_SETUP_ROUTE_REPLACE: 'true' BM_CTLPLANE_INTERFACE: enp1s0 BM_INSTANCE_MEMORY: '8192' BM_INSTANCE_NAME_PREFIX: edpm-compute-baremetal BM_INSTANCE_NAME_SUFFIX: '0' BM_NETWORK_NAME: default BM_NODE_COUNT: '1' BM_ROOT_PASSWORD: '' BM_ROOT_PASSWORD_SECRET: '' CEILOMETER_CENTRAL_DEPL_IMG: unused CEILOMETER_NOTIFICATION_DEPL_IMG: unused CEPH_BRANCH: release-1.15 CEPH_CLIENT: /home/zuul/ci-framework-data/artifacts/manifests/operator/rook/deploy/examples/toolbox.yaml CEPH_COMMON: /home/zuul/ci-framework-data/artifacts/manifests/operator/rook/deploy/examples/common.yaml CEPH_CR: /home/zuul/ci-framework-data/artifacts/manifests/operator/rook/deploy/examples/cluster-test.yaml CEPH_CRDS: /home/zuul/ci-framework-data/artifacts/manifests/operator/rook/deploy/examples/crds.yaml CEPH_IMG: quay.io/ceph/demo:latest-squid CEPH_OP: /home/zuul/ci-framework-data/artifacts/manifests/operator/rook/deploy/examples/operator-openshift.yaml CEPH_REPO: https://github.com/rook/rook.git CERTMANAGER_TIMEOUT: 300s CHECKOUT_FROM_OPENSTACK_REF: 'true' CINDER: config/samples/cinder_v1beta1_cinder.yaml CINDERAPI_DEPL_IMG: unused CINDERBKP_DEPL_IMG: unused CINDERSCH_DEPL_IMG: unused CINDERVOL_DEPL_IMG: unused CINDER_BRANCH: main CINDER_COMMIT_HASH: '' CINDER_CR: /home/zuul/ci-framework-data/artifacts/manifests/operator/cinder-operator/config/samples/cinder_v1beta1_cinder.yaml CINDER_IMG: quay.io/openstack-k8s-operators/cinder-operator-index:latest CINDER_KUTTL_CONF: 
/home/zuul/ci-framework-data/artifacts/manifests/operator/cinder-operator/kuttl-test.yaml CINDER_KUTTL_DIR: /home/zuul/ci-framework-data/artifacts/manifests/operator/cinder-operator/test/kuttl/tests CINDER_KUTTL_NAMESPACE: cinder-kuttl-tests CINDER_REPO: https://github.com/openstack-k8s-operators/cinder-operator.git CLEANUP_DIR_CMD: rm -Rf CRC_BGP_NIC_1_MAC: '52:54:00:11:11:11' CRC_BGP_NIC_2_MAC: '52:54:00:11:11:12' CRC_HTTPS_PROXY: '' CRC_HTTP_PROXY: '' CRC_STORAGE_NAMESPACE: crc-storage CRC_STORAGE_RETRIES: '3' CRC_URL: '''https://developers.redhat.com/content-gateway/rest/mirror/pub/openshift-v4/clients/crc/latest/crc-linux-amd64.tar.xz''' CRC_VERSION: latest DATAPLANE_ANSIBLE_SECRET: dataplane-ansible-ssh-private-key-secret DATAPLANE_ANSIBLE_USER: '' DATAPLANE_COMPUTE_IP: 192.168.122.100 DATAPLANE_CONTAINER_PREFIX: openstack DATAPLANE_CONTAINER_TAG: current-podified DATAPLANE_CUSTOM_SERVICE_RUNNER_IMG: quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest DATAPLANE_DEFAULT_GW: 192.168.122.1 DATAPLANE_EXTRA_NOVA_CONFIG_FILE: /dev/null DATAPLANE_GROWVOLS_ARGS: /=8GB /tmp=1GB /home=1GB /var=100% DATAPLANE_KUSTOMIZE_SCENARIO: preprovisioned DATAPLANE_NETWORKER_IP: 192.168.122.200 DATAPLANE_NETWORK_INTERFACE_NAME: eth0 DATAPLANE_NOVA_NFS_PATH: '' DATAPLANE_NTP_SERVER: pool.ntp.org DATAPLANE_PLAYBOOK: osp.edpm.download_cache DATAPLANE_REGISTRY_URL: quay.io/podified-antelope-centos9 DATAPLANE_RUNNER_IMG: '' DATAPLANE_SERVER_ROLE: compute DATAPLANE_SSHD_ALLOWED_RANGES: '[''192.168.122.0/24'']' DATAPLANE_TIMEOUT: 30m DATAPLANE_TLS_ENABLED: 'true' DATAPLANE_TOTAL_NETWORKER_NODES: '1' DATAPLANE_TOTAL_NODES: '1' DBSERVICE: galera DESIGNATE: config/samples/designate_v1beta1_designate.yaml DESIGNATE_BRANCH: main DESIGNATE_COMMIT_HASH: '' DESIGNATE_CR: /home/zuul/ci-framework-data/artifacts/manifests/operator/designate-operator/config/samples/designate_v1beta1_designate.yaml DESIGNATE_IMG: quay.io/openstack-k8s-operators/designate-operator-index:latest 
DESIGNATE_KUTTL_CONF: /home/zuul/ci-framework-data/artifacts/manifests/operator/designate-operator/kuttl-test.yaml DESIGNATE_KUTTL_DIR: /home/zuul/ci-framework-data/artifacts/manifests/operator/designate-operator/test/kuttl/tests DESIGNATE_KUTTL_NAMESPACE: designate-kuttl-tests DESIGNATE_REPO: https://github.com/openstack-k8s-operators/designate-operator.git DNSDATA: config/samples/network_v1beta1_dnsdata.yaml DNSDATA_CR: /home/zuul/ci-framework-data/artifacts/manifests/operator/infra-operator/config/samples/network_v1beta1_dnsdata.yaml DNSMASQ: config/samples/network_v1beta1_dnsmasq.yaml DNSMASQ_CR: /home/zuul/ci-framework-data/artifacts/manifests/operator/infra-operator/config/samples/network_v1beta1_dnsmasq.yaml DNS_DEPL_IMG: unused DNS_DOMAIN: localdomain DOWNLOAD_TOOLS_SELECTION: all EDPM_ATTACH_EXTNET: 'true' EDPM_COMPUTE_ADDITIONAL_HOST_ROUTES: '''[]''' EDPM_COMPUTE_ADDITIONAL_NETWORKS: '''[]''' EDPM_COMPUTE_CELLS: '1' EDPM_COMPUTE_CEPH_ENABLED: 'true' EDPM_COMPUTE_CEPH_NOVA: 'true' EDPM_COMPUTE_DHCP_AGENT_ENABLED: 'true' EDPM_COMPUTE_SRIOV_ENABLED: 'true' EDPM_COMPUTE_SUFFIX: '0' EDPM_CONFIGURE_DEFAULT_ROUTE: 'true' EDPM_CONFIGURE_HUGEPAGES: 'false' EDPM_CONFIGURE_NETWORKING: 'true' EDPM_FIRSTBOOT_EXTRA: /tmp/edpm-firstboot-extra EDPM_NETWORKER_SUFFIX: '0' EDPM_TOTAL_NETWORKERS: '1' EDPM_TOTAL_NODES: '1' GALERA_REPLICAS: '' GENERATE_SSH_KEYS: 'true' GIT_CLONE_OPTS: '' GLANCE: config/samples/glance_v1beta1_glance.yaml GLANCEAPI_DEPL_IMG: unused GLANCE_BRANCH: main GLANCE_COMMIT_HASH: '' GLANCE_CR: /home/zuul/ci-framework-data/artifacts/manifests/operator/glance-operator/config/samples/glance_v1beta1_glance.yaml GLANCE_IMG: quay.io/openstack-k8s-operators/glance-operator-index:latest GLANCE_KUTTL_CONF: /home/zuul/ci-framework-data/artifacts/manifests/operator/glance-operator/kuttl-test.yaml GLANCE_KUTTL_DIR: /home/zuul/ci-framework-data/artifacts/manifests/operator/glance-operator/test/kuttl/tests GLANCE_KUTTL_NAMESPACE: glance-kuttl-tests GLANCE_REPO: 
https://github.com/openstack-k8s-operators/glance-operator.git HEAT: config/samples/heat_v1beta1_heat.yaml HEATAPI_DEPL_IMG: unused HEATCFNAPI_DEPL_IMG: unused HEATENGINE_DEPL_IMG: unused HEAT_AUTH_ENCRYPTION_KEY: 76**********f0 HEAT_BRANCH: main HEAT_COMMIT_HASH: '' HEAT_CR: /home/zuul/ci-framework-data/artifacts/manifests/operator/heat-operator/config/samples/heat_v1beta1_heat.yaml HEAT_IMG: quay.io/openstack-k8s-operators/heat-operator-index:latest HEAT_KUTTL_CONF: /home/zuul/ci-framework-data/artifacts/manifests/operator/heat-operator/kuttl-test.yaml HEAT_KUTTL_DIR: /home/zuul/ci-framework-data/artifacts/manifests/operator/heat-operator/test/kuttl/tests HEAT_KUTTL_NAMESPACE: heat-kuttl-tests HEAT_REPO: https://github.com/openstack-k8s-operators/heat-operator.git HEAT_SERVICE_ENABLED: 'true' HORIZON: config/samples/horizon_v1beta1_horizon.yaml HORIZON_BRANCH: main HORIZON_COMMIT_HASH: '' HORIZON_CR: /home/zuul/ci-framework-data/artifacts/manifests/operator/horizon-operator/config/samples/horizon_v1beta1_horizon.yaml HORIZON_DEPL_IMG: unused HORIZON_IMG: quay.io/openstack-k8s-operators/horizon-operator-index:latest HORIZON_KUTTL_CONF: /home/zuul/ci-framework-data/artifacts/manifests/operator/horizon-operator/kuttl-test.yaml HORIZON_KUTTL_DIR: /home/zuul/ci-framework-data/artifacts/manifests/operator/horizon-operator/test/kuttl/tests HORIZON_KUTTL_NAMESPACE: horizon-kuttl-tests HORIZON_REPO: https://github.com/openstack-k8s-operators/horizon-operator.git INFRA_BRANCH: main INFRA_COMMIT_HASH: '' INFRA_IMG: quay.io/openstack-k8s-operators/infra-operator-index:latest INFRA_KUTTL_CONF: /home/zuul/ci-framework-data/artifacts/manifests/operator/infra-operator/kuttl-test.yaml INFRA_KUTTL_DIR: /home/zuul/ci-framework-data/artifacts/manifests/operator/infra-operator/test/kuttl/tests INFRA_KUTTL_NAMESPACE: infra-kuttl-tests INFRA_REPO: https://github.com/openstack-k8s-operators/infra-operator.git INSTALL_CERT_MANAGER: false INSTALL_NMSTATE: true || false INSTALL_NNCP: true 
|| false INTERNALAPI_HOST_ROUTES: '' IPV6_LAB_IPV4_NETWORK_IPADDRESS: 172.30.0.1/24 IPV6_LAB_IPV6_NETWORK_IPADDRESS: fd00:abcd:abcd:fc00::1/64 IPV6_LAB_LIBVIRT_STORAGE_POOL: default IPV6_LAB_MANAGE_FIREWALLD: 'true' IPV6_LAB_NAT64_HOST_IPV4: 172.30.0.2/24 IPV6_LAB_NAT64_HOST_IPV6: fd00:abcd:abcd:fc00::2/64 IPV6_LAB_NAT64_INSTANCE_NAME: nat64-router IPV6_LAB_NAT64_IPV6_NETWORK: fd00:abcd:abcd:fc00::/64 IPV6_LAB_NAT64_TAYGA_DYNAMIC_POOL: 192.168.255.0/24 IPV6_LAB_NAT64_TAYGA_IPV4: 192.168.255.1 IPV6_LAB_NAT64_TAYGA_IPV6: fd00:abcd:abcd:fc00::3 IPV6_LAB_NAT64_TAYGA_IPV6_PREFIX: fd00:abcd:abcd:fcff::/96 IPV6_LAB_NAT64_UPDATE_PACKAGES: 'false' IPV6_LAB_NETWORK_NAME: nat64 IPV6_LAB_SNO_CLUSTER_NETWORK: fd00:abcd:0::/48 IPV6_LAB_SNO_HOST_IP: fd00:abcd:abcd:fc00::11 IPV6_LAB_SNO_HOST_PREFIX: '64' IPV6_LAB_SNO_INSTANCE_NAME: sno IPV6_LAB_SNO_MACHINE_NETWORK: fd00:abcd:abcd:fc00::/64 IPV6_LAB_SNO_OCP_MIRROR_URL: https://mirror.openshift.com/pub/openshift-v4/clients/ocp IPV6_LAB_SNO_OCP_VERSION: latest-4.14 IPV6_LAB_SNO_SERVICE_NETWORK: fd00:abcd:abcd:fc03::/112 IPV6_LAB_SSH_PUB_KEY: /home/zuul/.ssh/id_rsa.pub IPV6_LAB_WORK_DIR: /home/zuul/.ipv6lab IRONIC: config/samples/ironic_v1beta1_ironic.yaml IRONICAPI_DEPL_IMG: unused IRONICCON_DEPL_IMG: unused IRONICINS_DEPL_IMG: unused IRONICNAG_DEPL_IMG: unused IRONICPXE_DEPL_IMG: unused IRONIC_BRANCH: main IRONIC_COMMIT_HASH: '' IRONIC_CR: /home/zuul/ci-framework-data/artifacts/manifests/operator/ironic-operator/config/samples/ironic_v1beta1_ironic.yaml IRONIC_IMAGE: quay.io/metal3-io/ironic IRONIC_IMAGE_TAG: release-24.1 IRONIC_IMG: quay.io/openstack-k8s-operators/ironic-operator-index:latest IRONIC_KUTTL_CONF: /home/zuul/ci-framework-data/artifacts/manifests/operator/ironic-operator/kuttl-test.yaml IRONIC_KUTTL_DIR: /home/zuul/ci-framework-data/artifacts/manifests/operator/ironic-operator/test/kuttl/tests IRONIC_KUTTL_NAMESPACE: ironic-kuttl-tests IRONIC_REPO: https://github.com/openstack-k8s-operators/ironic-operator.git 
KEYSTONEAPI: config/samples/keystone_v1beta1_keystoneapi.yaml KEYSTONEAPI_CR: /home/zuul/ci-framework-data/artifacts/manifests/operator/keystone-operator/config/samples/keystone_v1beta1_keystoneapi.yaml KEYSTONEAPI_DEPL_IMG: unused KEYSTONE_BRANCH: main KEYSTONE_COMMIT_HASH: '' KEYSTONE_FEDERATION_CLIENT_SECRET: CO**********6f KEYSTONE_FEDERATION_CRYPTO_PASSPHRASE: openstack KEYSTONE_IMG: quay.io/openstack-k8s-operators/keystone-operator-index:latest KEYSTONE_KUTTL_CONF: /home/zuul/ci-framework-data/artifacts/manifests/operator/keystone-operator/kuttl-test.yaml KEYSTONE_KUTTL_DIR: /home/zuul/ci-framework-data/artifacts/manifests/operator/keystone-operator/test/kuttl/tests KEYSTONE_KUTTL_NAMESPACE: keystone-kuttl-tests KEYSTONE_REPO: https://github.com/openstack-k8s-operators/keystone-operator.git KUBEADMIN_PWD: '12345678' LIBVIRT_SECRET: libvirt-secret LOKI_DEPLOY_MODE: openshift-network LOKI_DEPLOY_NAMESPACE: netobserv LOKI_DEPLOY_SIZE: 1x.demo LOKI_NAMESPACE: openshift-operators-redhat LOKI_OPERATOR_GROUP: openshift-operators-redhat-loki LOKI_SUBSCRIPTION: loki-operator LVMS_CR: '1' MANILA: config/samples/manila_v1beta1_manila.yaml MANILAAPI_DEPL_IMG: unused MANILASCH_DEPL_IMG: unused MANILASHARE_DEPL_IMG: unused MANILA_BRANCH: main MANILA_COMMIT_HASH: '' MANILA_CR: /home/zuul/ci-framework-data/artifacts/manifests/operator/manila-operator/config/samples/manila_v1beta1_manila.yaml MANILA_IMG: quay.io/openstack-k8s-operators/manila-operator-index:latest MANILA_KUTTL_CONF: /home/zuul/ci-framework-data/artifacts/manifests/operator/manila-operator/kuttl-test.yaml MANILA_KUTTL_DIR: /home/zuul/ci-framework-data/artifacts/manifests/operator/manila-operator/test/kuttl/tests MANILA_KUTTL_NAMESPACE: manila-kuttl-tests MANILA_REPO: https://github.com/openstack-k8s-operators/manila-operator.git MANILA_SERVICE_ENABLED: 'true' MARIADB: config/samples/mariadb_v1beta1_galera.yaml MARIADB_BRANCH: main MARIADB_CHAINSAW_CONF: 
/home/zuul/ci-framework-data/artifacts/manifests/operator/mariadb-operator/test/chainsaw/config.yaml MARIADB_CHAINSAW_DIR: /home/zuul/ci-framework-data/artifacts/manifests/operator/mariadb-operator/test/chainsaw/tests MARIADB_CHAINSAW_NAMESPACE: mariadb-chainsaw-tests MARIADB_COMMIT_HASH: '' MARIADB_CR: /home/zuul/ci-framework-data/artifacts/manifests/operator/mariadb-operator/config/samples/mariadb_v1beta1_galera.yaml MARIADB_DEPL_IMG: unused MARIADB_IMG: quay.io/openstack-k8s-operators/mariadb-operator-index:latest MARIADB_KUTTL_CONF: /home/zuul/ci-framework-data/artifacts/manifests/operator/mariadb-operator/kuttl-test.yaml MARIADB_KUTTL_DIR: /home/zuul/ci-framework-data/artifacts/manifests/operator/mariadb-operator/test/kuttl/tests MARIADB_KUTTL_NAMESPACE: mariadb-kuttl-tests MARIADB_REPO: https://github.com/openstack-k8s-operators/mariadb-operator.git MEMCACHED: config/samples/memcached_v1beta1_memcached.yaml MEMCACHED_CR: /home/zuul/ci-framework-data/artifacts/manifests/operator/infra-operator/config/samples/memcached_v1beta1_memcached.yaml MEMCACHED_DEPL_IMG: unused METADATA_SHARED_SECRET: '12**********42' METALLB_IPV6_POOL: fd00:aaaa::80-fd00:aaaa::90 METALLB_POOL: 192.168.122.80-192.168.122.90 MICROSHIFT: '0' NAMESPACE: openstack NETCONFIG: config/samples/network_v1beta1_netconfig.yaml NETCONFIG_CR: /home/zuul/ci-framework-data/artifacts/manifests/operator/infra-operator/config/samples/network_v1beta1_netconfig.yaml NETCONFIG_DEPL_IMG: unused NETOBSERV_DEPLOY_NAMESPACE: netobserv NETOBSERV_NAMESPACE: openshift-netobserv-operator NETOBSERV_OPERATOR_GROUP: openshift-netobserv-operator-net NETOBSERV_SUBSCRIPTION: netobserv-operator NETWORK_BGP: 'false' NETWORK_DESIGNATE_ADDRESS_PREFIX: 172.28.0 NETWORK_DESIGNATE_EXT_ADDRESS_PREFIX: 172.50.0 NETWORK_INTERNALAPI_ADDRESS_PREFIX: 172.17.0 NETWORK_ISOLATION: 'true' NETWORK_ISOLATION_INSTANCE_NAME: crc NETWORK_ISOLATION_IPV4: 'true' NETWORK_ISOLATION_IPV4_ADDRESS: 172.16.1.1/24 NETWORK_ISOLATION_IPV4_NAT: 'true' 
NETWORK_ISOLATION_IPV6: 'false' NETWORK_ISOLATION_IPV6_ADDRESS: fd00:aaaa::1/64 NETWORK_ISOLATION_IP_ADDRESS: 192.168.122.10 NETWORK_ISOLATION_MAC: '52:54:00:11:11:10' NETWORK_ISOLATION_NETWORK_NAME: net-iso NETWORK_ISOLATION_NET_NAME: default NETWORK_ISOLATION_USE_DEFAULT_NETWORK: 'true' NETWORK_MTU: '1500' NETWORK_STORAGEMGMT_ADDRESS_PREFIX: 172.20.0 NETWORK_STORAGE_ADDRESS_PREFIX: 172.18.0 NETWORK_STORAGE_MACVLAN: '' NETWORK_TENANT_ADDRESS_PREFIX: 172.19.0 NETWORK_VLAN_START: '20' NETWORK_VLAN_STEP: '1' NEUTRONAPI: config/samples/neutron_v1beta1_neutronapi.yaml NEUTRONAPI_CR: /home/zuul/ci-framework-data/artifacts/manifests/operator/neutron-operator/config/samples/neutron_v1beta1_neutronapi.yaml NEUTRONAPI_DEPL_IMG: unused NEUTRON_BRANCH: '' NEUTRON_COMMIT_HASH: '' NEUTRON_IMG: quay.io/openstack-k8s-operators/neutron-operator-index:latest NEUTRON_KUTTL_CONF: /home/zuul/ci-framework-data/artifacts/manifests/operator/neutron-operator/kuttl-test.yaml NEUTRON_KUTTL_DIR: /home/zuul/ci-framework-data/artifacts/manifests/operator/neutron-operator/test/kuttl/tests NEUTRON_KUTTL_NAMESPACE: neutron-kuttl-tests NEUTRON_REPO: /home/zuul/src/github.com/openstack-k8s-operators/neutron-operator NFS_HOME: /home/nfs NMSTATE_NAMESPACE: openshift-nmstate NMSTATE_OPERATOR_GROUP: openshift-nmstate-tn6k8 NMSTATE_SUBSCRIPTION: kubernetes-nmstate-operator NNCP_ADDITIONAL_HOST_ROUTES: '' NNCP_BGP_1_INTERFACE: enp7s0 NNCP_BGP_1_IP_ADDRESS: 100.65.4.2 NNCP_BGP_2_INTERFACE: enp8s0 NNCP_BGP_2_IP_ADDRESS: 100.64.4.2 NNCP_BRIDGE: ospbr NNCP_CLEANUP_TIMEOUT: 120s NNCP_CTLPLANE_IPV6_ADDRESS_PREFIX: 'fd00:aaaa::' NNCP_CTLPLANE_IPV6_ADDRESS_SUFFIX: '10' NNCP_CTLPLANE_IP_ADDRESS_PREFIX: 192.168.122 NNCP_CTLPLANE_IP_ADDRESS_SUFFIX: '10' NNCP_DNS_SERVER: 192.168.122.1 NNCP_DNS_SERVER_IPV6: fd00:aaaa::1 NNCP_GATEWAY: 192.168.122.1 NNCP_GATEWAY_IPV6: fd00:aaaa::1 NNCP_INTERFACE: enp6s0 NNCP_NODES: '' NNCP_TIMEOUT: 240s NOVA: config/samples/nova_v1beta1_nova_collapsed_cell.yaml NOVA_BRANCH: main 
NOVA_COMMIT_HASH: '' NOVA_CR: /home/zuul/ci-framework-data/artifacts/manifests/operator/nova-operator/config/samples/nova_v1beta1_nova_collapsed_cell.yaml NOVA_IMG: quay.io/openstack-k8s-operators/nova-operator-index:latest NOVA_REPO: https://github.com/openstack-k8s-operators/nova-operator.git NUMBER_OF_INSTANCES: '1' OCP_NETWORK_NAME: crc OCTAVIA: config/samples/octavia_v1beta1_octavia.yaml OCTAVIA_BRANCH: main OCTAVIA_COMMIT_HASH: '' OCTAVIA_CR: /home/zuul/ci-framework-data/artifacts/manifests/operator/octavia-operator/config/samples/octavia_v1beta1_octavia.yaml OCTAVIA_IMG: quay.io/openstack-k8s-operators/octavia-operator-index:latest OCTAVIA_KUTTL_CONF: /home/zuul/ci-framework-data/artifacts/manifests/operator/octavia-operator/kuttl-test.yaml OCTAVIA_KUTTL_DIR: /home/zuul/ci-framework-data/artifacts/manifests/operator/octavia-operator/test/kuttl/tests OCTAVIA_KUTTL_NAMESPACE: octavia-kuttl-tests OCTAVIA_REPO: https://github.com/openstack-k8s-operators/octavia-operator.git OKD: 'false' OPENSTACK_BRANCH: main OPENSTACK_BUNDLE_IMG: quay.io/openstack-k8s-operators/openstack-operator-bundle:latest OPENSTACK_COMMIT_HASH: '' OPENSTACK_CR: /home/zuul/ci-framework-data/artifacts/manifests/operator/openstack-operator/config/samples/core_v1beta1_openstackcontrolplane_galera_network_isolation.yaml OPENSTACK_CRDS_DIR: openstack_crds OPENSTACK_CTLPLANE: config/samples/core_v1beta1_openstackcontrolplane_galera_network_isolation.yaml OPENSTACK_IMG: quay.io/openstack-k8s-operators/openstack-operator-index:latest OPENSTACK_K8S_BRANCH: main OPENSTACK_K8S_TAG: latest OPENSTACK_KUTTL_CONF: /home/zuul/ci-framework-data/artifacts/manifests/operator/openstack-operator/kuttl-test.yaml OPENSTACK_KUTTL_DIR: /home/zuul/ci-framework-data/artifacts/manifests/operator/openstack-operator/test/kuttl/tests OPENSTACK_KUTTL_NAMESPACE: openstack-kuttl-tests OPENSTACK_NEUTRON_CUSTOM_CONF: '' OPENSTACK_REPO: https://github.com/openstack-k8s-operators/openstack-operator.git 
OPENSTACK_STORAGE_BUNDLE_IMG: quay.io/openstack-k8s-operators/openstack-operator-storage-bundle:latest OPERATOR_BASE_DIR: /home/zuul/ci-framework-data/artifacts/manifests/operator OPERATOR_CHANNEL: '' OPERATOR_NAMESPACE: openstack-operators OPERATOR_SOURCE: '' OPERATOR_SOURCE_NAMESPACE: '' OUT: /home/zuul/ci-framework-data/artifacts/manifests OUTPUT_DIR: /home/zuul/ci-framework-data/artifacts/edpm OVNCONTROLLER: config/samples/ovn_v1beta1_ovncontroller.yaml OVNCONTROLLER_CR: /home/zuul/ci-framework-data/artifacts/manifests/operator/ovn-operator/config/samples/ovn_v1beta1_ovncontroller.yaml OVNCONTROLLER_NMAP: 'true' OVNDBS: config/samples/ovn_v1beta1_ovndbcluster.yaml OVNDBS_CR: /home/zuul/ci-framework-data/artifacts/manifests/operator/ovn-operator/config/samples/ovn_v1beta1_ovndbcluster.yaml OVNNORTHD: config/samples/ovn_v1beta1_ovnnorthd.yaml OVNNORTHD_CR: /home/zuul/ci-framework-data/artifacts/manifests/operator/ovn-operator/config/samples/ovn_v1beta1_ovnnorthd.yaml OVN_BRANCH: main OVN_COMMIT_HASH: '' OVN_IMG: quay.io/openstack-k8s-operators/ovn-operator-index:latest OVN_KUTTL_CONF: /home/zuul/ci-framework-data/artifacts/manifests/operator/ovn-operator/kuttl-test.yaml OVN_KUTTL_DIR: /home/zuul/ci-framework-data/artifacts/manifests/operator/ovn-operator/test/kuttl/tests OVN_KUTTL_NAMESPACE: ovn-kuttl-tests OVN_REPO: https://github.com/openstack-k8s-operators/ovn-operator.git PASSWORD: '12**********78' PLACEMENTAPI: config/samples/placement_v1beta1_placementapi.yaml PLACEMENTAPI_CR: /home/zuul/ci-framework-data/artifacts/manifests/operator/placement-operator/config/samples/placement_v1beta1_placementapi.yaml PLACEMENTAPI_DEPL_IMG: unused PLACEMENT_BRANCH: main PLACEMENT_COMMIT_HASH: '' PLACEMENT_IMG: quay.io/openstack-k8s-operators/placement-operator-index:latest PLACEMENT_KUTTL_CONF: /home/zuul/ci-framework-data/artifacts/manifests/operator/placement-operator/kuttl-test.yaml PLACEMENT_KUTTL_DIR: 
/home/zuul/ci-framework-data/artifacts/manifests/operator/placement-operator/test/kuttl/tests PLACEMENT_KUTTL_NAMESPACE: placement-kuttl-tests PLACEMENT_REPO: https://github.com/openstack-k8s-operators/placement-operator.git PULL_SECRET: /home/zuul/src/github.com/openstack-k8s-operators/ci-framework/pull-secret.txt RABBITMQ: docs/examples/default-security-context/rabbitmq.yaml RABBITMQ_BRANCH: patches RABBITMQ_COMMIT_HASH: '' RABBITMQ_CR: /home/zuul/ci-framework-data/artifacts/manifests/operator/rabbitmq-operator/docs/examples/default-security-context/rabbitmq.yaml RABBITMQ_DEPL_IMG: unused RABBITMQ_IMG: quay.io/openstack-k8s-operators/rabbitmq-cluster-operator-index:latest RABBITMQ_REPO: https://github.com/openstack-k8s-operators/rabbitmq-cluster-operator.git REDHAT_OPERATORS: 'false' REDIS: config/samples/redis_v1beta1_redis.yaml REDIS_CR: /home/zuul/ci-framework-data/artifacts/manifests/operator/infra-operator-redis/config/samples/redis_v1beta1_redis.yaml REDIS_DEPL_IMG: unused RH_REGISTRY_PWD: '' RH_REGISTRY_USER: '' SECRET: os**********et SG_CORE_DEPL_IMG: unused STANDALONE_COMPUTE_DRIVER: libvirt STANDALONE_EXTERNAL_NET_PREFFIX: 172.21.0 STANDALONE_INTERNALAPI_NET_PREFIX: 172.17.0 STANDALONE_STORAGEMGMT_NET_PREFIX: 172.20.0 STANDALONE_STORAGE_NET_PREFIX: 172.18.0 STANDALONE_TENANT_NET_PREFIX: 172.19.0 STORAGEMGMT_HOST_ROUTES: '' STORAGE_CLASS: local-storage STORAGE_HOST_ROUTES: '' SWIFT: config/samples/swift_v1beta1_swift.yaml SWIFT_BRANCH: main SWIFT_COMMIT_HASH: '' SWIFT_CR: /home/zuul/ci-framework-data/artifacts/manifests/operator/swift-operator/config/samples/swift_v1beta1_swift.yaml SWIFT_IMG: quay.io/openstack-k8s-operators/swift-operator-index:latest SWIFT_KUTTL_CONF: /home/zuul/ci-framework-data/artifacts/manifests/operator/swift-operator/kuttl-test.yaml SWIFT_KUTTL_DIR: /home/zuul/ci-framework-data/artifacts/manifests/operator/swift-operator/test/kuttl/tests SWIFT_KUTTL_NAMESPACE: swift-kuttl-tests SWIFT_REPO: 
https://github.com/openstack-k8s-operators/swift-operator.git TELEMETRY: config/samples/telemetry_v1beta1_telemetry.yaml TELEMETRY_BRANCH: main TELEMETRY_COMMIT_HASH: '' TELEMETRY_CR: /home/zuul/ci-framework-data/artifacts/manifests/operator/telemetry-operator/config/samples/telemetry_v1beta1_telemetry.yaml TELEMETRY_IMG: quay.io/openstack-k8s-operators/telemetry-operator-index:latest TELEMETRY_KUTTL_BASEDIR: /home/zuul/ci-framework-data/artifacts/manifests/operator/telemetry-operator TELEMETRY_KUTTL_CONF: /home/zuul/ci-framework-data/artifacts/manifests/operator/telemetry-operator/kuttl-test.yaml TELEMETRY_KUTTL_DIR: /home/zuul/ci-framework-data/artifacts/manifests/operator/telemetry-operator/test/kuttl/suites TELEMETRY_KUTTL_NAMESPACE: telemetry-kuttl-tests TELEMETRY_KUTTL_RELPATH: test/kuttl/suites TELEMETRY_REPO: https://github.com/openstack-k8s-operators/telemetry-operator.git TENANT_HOST_ROUTES: '' TIMEOUT: 300s TLS_ENABLED: 'false' tripleo_deploy: 'export REGISTRY_USER:' 2026-01-22 16:15:08,785 p=31411 u=zuul n=ansible | TASK [install_yamls : Generate make targets install_yamls_path={{ cifmw_install_yamls_repo }}, output_directory={{ cifmw_install_yamls_tasks_out }}] *** 2026-01-22 16:15:08,785 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:08 +0000 (0:00:00.038) 0:00:58.889 ****** 2026-01-22 16:15:08,785 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:08 +0000 (0:00:00.038) 0:00:58.888 ****** 2026-01-22 16:15:09,110 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:15:09,118 p=31411 u=zuul n=ansible | TASK [install_yamls : Debug generate_make module var=cifmw_generate_makes] ***** 2026-01-22 16:15:09,118 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:09 +0000 (0:00:00.332) 0:00:59.222 ****** 2026-01-22 16:15:09,118 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:09 +0000 (0:00:00.332) 0:00:59.221 ****** 2026-01-22 16:15:09,156 p=31411 u=zuul n=ansible | ok: [localhost] => cifmw_generate_makes: changed: false 
debug: /home/zuul/src/github.com/openstack-k8s-operators/install_yamls/Makefile: - all - help - cleanup - deploy_cleanup - wait - crc_storage - crc_storage_cleanup - crc_storage_release - crc_storage_with_retries - crc_storage_cleanup_with_retries - operator_namespace - namespace - namespace_cleanup - input - input_cleanup - crc_bmo_setup - crc_bmo_cleanup - openstack_prep - openstack - openstack_wait - openstack_init - openstack_cleanup - openstack_repo - openstack_deploy_prep - openstack_deploy - openstack_wait_deploy - openstack_deploy_cleanup - openstack_update_run - update_services - update_system - openstack_patch_version - edpm_deploy_generate_keys - edpm_patch_ansible_runner_image - edpm_deploy_prep - edpm_deploy_cleanup - edpm_deploy - edpm_deploy_baremetal_prep - edpm_deploy_baremetal - edpm_wait_deploy_baremetal - edpm_wait_deploy - edpm_register_dns - edpm_nova_discover_hosts - openstack_crds - openstack_crds_cleanup - edpm_deploy_networker_prep - edpm_deploy_networker_cleanup - edpm_deploy_networker - infra_prep - infra - infra_cleanup - dns_deploy_prep - dns_deploy - dns_deploy_cleanup - netconfig_deploy_prep - netconfig_deploy - netconfig_deploy_cleanup - memcached_deploy_prep - memcached_deploy - memcached_deploy_cleanup - keystone_prep - keystone - keystone_cleanup - keystone_deploy_prep - keystone_deploy - keystone_deploy_cleanup - barbican_prep - barbican - barbican_cleanup - barbican_deploy_prep - barbican_deploy - barbican_deploy_validate - barbican_deploy_cleanup - mariadb - mariadb_cleanup - mariadb_deploy_prep - mariadb_deploy - mariadb_deploy_cleanup - placement_prep - placement - placement_cleanup - placement_deploy_prep - placement_deploy - placement_deploy_cleanup - glance_prep - glance - glance_cleanup - glance_deploy_prep - glance_deploy - glance_deploy_cleanup - ovn_prep - ovn - ovn_cleanup - ovn_deploy_prep - ovn_deploy - ovn_deploy_cleanup - neutron_prep - neutron - neutron_cleanup - neutron_deploy_prep - neutron_deploy - 
neutron_deploy_cleanup - cinder_prep - cinder - cinder_cleanup - cinder_deploy_prep - cinder_deploy - cinder_deploy_cleanup - rabbitmq_prep - rabbitmq - rabbitmq_cleanup - rabbitmq_deploy_prep - rabbitmq_deploy - rabbitmq_deploy_cleanup - ironic_prep - ironic - ironic_cleanup - ironic_deploy_prep - ironic_deploy - ironic_deploy_cleanup - octavia_prep - octavia - octavia_cleanup - octavia_deploy_prep - octavia_deploy - octavia_deploy_cleanup - designate_prep - designate - designate_cleanup - designate_deploy_prep - designate_deploy - designate_deploy_cleanup - nova_prep - nova - nova_cleanup - nova_deploy_prep - nova_deploy - nova_deploy_cleanup - mariadb_kuttl_run - mariadb_kuttl - kuttl_db_prep - kuttl_db_cleanup - kuttl_common_prep - kuttl_common_cleanup - keystone_kuttl_run - keystone_kuttl - barbican_kuttl_run - barbican_kuttl - placement_kuttl_run - placement_kuttl - cinder_kuttl_run - cinder_kuttl - neutron_kuttl_run - neutron_kuttl - octavia_kuttl_run - octavia_kuttl - designate_kuttl - designate_kuttl_run - ovn_kuttl_run - ovn_kuttl - infra_kuttl_run - infra_kuttl - ironic_kuttl_run - ironic_kuttl - ironic_kuttl_crc - heat_kuttl_run - heat_kuttl - heat_kuttl_crc - ansibleee_kuttl_run - ansibleee_kuttl_cleanup - ansibleee_kuttl_prep - ansibleee_kuttl - glance_kuttl_run - glance_kuttl - manila_kuttl_run - manila_kuttl - swift_kuttl_run - swift_kuttl - horizon_kuttl_run - horizon_kuttl - openstack_kuttl_run - openstack_kuttl - mariadb_chainsaw_run - mariadb_chainsaw - horizon_prep - horizon - horizon_cleanup - horizon_deploy_prep - horizon_deploy - horizon_deploy_cleanup - heat_prep - heat - heat_cleanup - heat_deploy_prep - heat_deploy - heat_deploy_cleanup - ansibleee_prep - ansibleee - ansibleee_cleanup - baremetal_prep - baremetal - baremetal_cleanup - ceph_help - ceph - ceph_cleanup - rook_prep - rook - rook_deploy_prep - rook_deploy - rook_crc_disk - rook_cleanup - lvms - nmstate - nncp - nncp_cleanup - netattach - netattach_cleanup - metallb - 
metallb_config - metallb_config_cleanup - metallb_cleanup - loki - loki_cleanup - loki_deploy - loki_deploy_cleanup - netobserv - netobserv_cleanup - netobserv_deploy - netobserv_deploy_cleanup - manila_prep - manila - manila_cleanup - manila_deploy_prep - manila_deploy - manila_deploy_cleanup - telemetry_prep - telemetry - telemetry_cleanup - telemetry_deploy_prep - telemetry_deploy - telemetry_deploy_cleanup - telemetry_kuttl_run - telemetry_kuttl - swift_prep - swift - swift_cleanup - swift_deploy_prep - swift_deploy - swift_deploy_cleanup - certmanager - certmanager_cleanup - validate_marketplace - redis_deploy_prep - redis_deploy - redis_deploy_cleanup - set_slower_etcd_profile /home/zuul/src/github.com/openstack-k8s-operators/install_yamls/devsetup/Makefile: - help - download_tools - nfs - nfs_cleanup - crc - crc_cleanup - crc_scrub - crc_attach_default_interface - crc_attach_default_interface_cleanup - ipv6_lab_network - ipv6_lab_network_cleanup - ipv6_lab_nat64_router - ipv6_lab_nat64_router_cleanup - ipv6_lab_sno - ipv6_lab_sno_cleanup - ipv6_lab - ipv6_lab_cleanup - attach_default_interface - attach_default_interface_cleanup - network_isolation_bridge - network_isolation_bridge_cleanup - edpm_baremetal_compute - edpm_compute - edpm_compute_bootc - edpm_ansible_runner - edpm_computes_bgp - edpm_compute_repos - edpm_compute_cleanup - edpm_networker - edpm_networker_cleanup - edpm_deploy_instance - tripleo_deploy - standalone_deploy - standalone_sync - standalone - standalone_cleanup - standalone_snapshot - standalone_revert - cifmw_prepare - cifmw_cleanup - bmaas_network - bmaas_network_cleanup - bmaas_route_crc_and_crc_bmaas_networks - bmaas_route_crc_and_crc_bmaas_networks_cleanup - bmaas_crc_attach_network - bmaas_crc_attach_network_cleanup - bmaas_crc_baremetal_bridge - bmaas_crc_baremetal_bridge_cleanup - bmaas_baremetal_net_nad - bmaas_baremetal_net_nad_cleanup - bmaas_metallb - bmaas_metallb_cleanup - bmaas_virtual_bms - bmaas_virtual_bms_cleanup - 
bmaas_sushy_emulator - bmaas_sushy_emulator_cleanup - bmaas_sushy_emulator_wait - bmaas_generate_nodes_yaml - bmaas - bmaas_cleanup failed: false success: true 2026-01-22 16:15:09,166 p=31411 u=zuul n=ansible | TASK [install_yamls : Create the install_yamls parameters file dest={{ cifmw_basedir }}/artifacts/parameters/install-yamls-params.yml, content={{ { 'cifmw_install_yamls_environment': cifmw_install_yamls_environment, 'cifmw_install_yamls_defaults': cifmw_install_yamls_defaults } | to_nice_yaml }}, mode=0644] *** 2026-01-22 16:15:09,166 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:09 +0000 (0:00:00.047) 0:00:59.270 ****** 2026-01-22 16:15:09,166 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:09 +0000 (0:00:00.047) 0:00:59.268 ****** 2026-01-22 16:15:09,608 p=31411 u=zuul n=ansible | changed: [localhost] 2026-01-22 16:15:09,623 p=31411 u=zuul n=ansible | TASK [install_yamls : Create empty cifmw_install_yamls_environment if needed cifmw_install_yamls_environment={}] *** 2026-01-22 16:15:09,623 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:09 +0000 (0:00:00.457) 0:00:59.727 ****** 2026-01-22 16:15:09,623 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:09 +0000 (0:00:00.457) 0:00:59.726 ****** 2026-01-22 16:15:09,643 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:15:09,671 p=31411 u=zuul n=ansible | TASK [discover_latest_image : Get latest image url={{ cifmw_discover_latest_image_base_url }}, image_prefix={{ cifmw_discover_latest_image_qcow_prefix }}, images_file={{ cifmw_discover_latest_image_images_file }}] *** 2026-01-22 16:15:09,671 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:09 +0000 (0:00:00.047) 0:00:59.775 ****** 2026-01-22 16:15:09,671 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:09 +0000 (0:00:00.047) 0:00:59.773 ****** 2026-01-22 16:15:10,025 p=31411 u=zuul n=ansible | changed: [localhost] 2026-01-22 16:15:10,032 p=31411 u=zuul n=ansible | TASK 
[discover_latest_image : Export facts accordingly cifmw_discovered_image_name={{ discovered_image['data']['image_name'] }}, cifmw_discovered_image_url={{ discovered_image['data']['image_url'] }}, cifmw_discovered_hash={{ discovered_image['data']['hash'] }}, cifmw_discovered_hash_algorithm={{ discovered_image['data']['hash_algorithm'] }}, cacheable=True] *** 2026-01-22 16:15:10,032 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:10 +0000 (0:00:00.360) 0:01:00.136 ****** 2026-01-22 16:15:10,032 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:10 +0000 (0:00:00.360) 0:01:00.134 ****** 2026-01-22 16:15:10,058 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:15:10,069 p=31411 u=zuul n=ansible | TASK [cifmw_setup : Create artifacts with custom params mode=0644, dest={{ cifmw_basedir }}/artifacts/parameters/custom-params.yml, content={{ ci_framework_params | to_nice_yaml }}] *** 2026-01-22 16:15:10,069 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:10 +0000 (0:00:00.037) 0:01:00.173 ****** 2026-01-22 16:15:10,069 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:10 +0000 (0:00:00.037) 0:01:00.172 ****** 2026-01-22 16:15:10,454 p=31411 u=zuul n=ansible | changed: [localhost] 2026-01-22 16:15:10,480 p=31411 u=zuul n=ansible | TASK [run_hook : Assert parameters are valid quiet=True, that=['_list_hooks is not string', '_list_hooks is not mapping', '_list_hooks is iterable', '(hooks | default([])) is not string', '(hooks | default([])) is not mapping', '(hooks | default([])) is iterable']] *** 2026-01-22 16:15:10,480 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:10 +0000 (0:00:00.410) 0:01:00.584 ****** 2026-01-22 16:15:10,480 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:10 +0000 (0:00:00.410) 0:01:00.583 ****** 2026-01-22 16:15:10,543 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:15:10,555 p=31411 u=zuul n=ansible | TASK [run_hook : Assert single hooks are all mappings quiet=True, 
that=['_not_mapping_hooks | length == 0'], msg=All single hooks must be a list of mappings or a mapping.] *** 2026-01-22 16:15:10,555 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:10 +0000 (0:00:00.074) 0:01:00.659 ****** 2026-01-22 16:15:10,555 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:10 +0000 (0:00:00.074) 0:01:00.658 ****** 2026-01-22 16:15:10,624 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:15:10,642 p=31411 u=zuul n=ansible | TASK [run_hook : Loop on hooks for pre_infra _raw_params={{ hook.type }}.yml] *** 2026-01-22 16:15:10,642 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:10 +0000 (0:00:00.087) 0:01:00.746 ****** 2026-01-22 16:15:10,642 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:10 +0000 (0:00:00.087) 0:01:00.745 ****** 2026-01-22 16:15:10,746 p=31411 u=zuul n=ansible | included: /home/zuul/src/github.com/openstack-k8s-operators/ci-framework/roles/run_hook/tasks/playbook.yml for localhost => (item={'name': 'Download needed tools', 'inventory': 'localhost,', 'connection': 'local', 'type': 'playbook', 'source': '/home/zuul/src/github.com/openstack-k8s-operators/install_yamls/devsetup/download_tools.yaml'}) 2026-01-22 16:15:10,756 p=31411 u=zuul n=ansible | TASK [run_hook : Set playbook path for Download needed tools cifmw_basedir={{ _bdir }}, hook_name={{ _hook_name }}, playbook_path={{ _play | realpath }}, log_path={{ _bdir }}/logs/{{ step }}_{{ _hook_name }}.log, extra_vars=-e namespace={{ cifmw_openstack_namespace }} {%- if hook.extra_vars is defined and hook.extra_vars|length > 0 -%} {% for key,value in hook.extra_vars.items() -%} {%- if key == 'file' %} -e "@{{ value }}" {%- else %} -e "{{ key }}={{ value }}" {%- endif %} {%- endfor %} {%- endif %}] *** 2026-01-22 16:15:10,756 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:10 +0000 (0:00:00.113) 0:01:00.860 ****** 2026-01-22 16:15:10,756 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:10 +0000 (0:00:00.113) 
0:01:00.859 ****** 2026-01-22 16:15:10,795 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:15:10,803 p=31411 u=zuul n=ansible | TASK [run_hook : Get file stat path={{ playbook_path }}] *********************** 2026-01-22 16:15:10,803 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:10 +0000 (0:00:00.047) 0:01:00.907 ****** 2026-01-22 16:15:10,803 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:10 +0000 (0:00:00.047) 0:01:00.906 ****** 2026-01-22 16:15:10,985 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:15:10,992 p=31411 u=zuul n=ansible | TASK [run_hook : Fail if playbook doesn't exist msg=Playbook {{ playbook_path }} doesn't seem to exist.] *** 2026-01-22 16:15:10,992 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:10 +0000 (0:00:00.189) 0:01:01.096 ****** 2026-01-22 16:15:10,992 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:10 +0000 (0:00:00.189) 0:01:01.095 ****** 2026-01-22 16:15:11,006 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:15:11,013 p=31411 u=zuul n=ansible | TASK [run_hook : Get parameters files paths={{ (cifmw_basedir, 'artifacts/parameters') | path_join }}, file_type=file, patterns=*.yml] *** 2026-01-22 16:15:11,013 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:11 +0000 (0:00:00.021) 0:01:01.117 ****** 2026-01-22 16:15:11,013 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:11 +0000 (0:00:00.021) 0:01:01.116 ****** 2026-01-22 16:15:11,184 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:15:11,192 p=31411 u=zuul n=ansible | TASK [run_hook : Add parameters artifacts as extra variables extra_vars={{ extra_vars }} {% for file in cifmw_run_hook_parameters_files.files %} -e "@{{ file.path }}" {%- endfor %}] *** 2026-01-22 16:15:11,192 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:11 +0000 (0:00:00.178) 0:01:01.296 ****** 2026-01-22 16:15:11,192 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:11 +0000 (0:00:00.178) 0:01:01.295 
****** 2026-01-22 16:15:11,209 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:15:11,217 p=31411 u=zuul n=ansible | TASK [run_hook : Ensure log directory exists path={{ log_path | dirname }}, state=directory, mode=0755] *** 2026-01-22 16:15:11,217 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:11 +0000 (0:00:00.024) 0:01:01.321 ****** 2026-01-22 16:15:11,217 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:11 +0000 (0:00:00.024) 0:01:01.320 ****** 2026-01-22 16:15:11,396 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:15:11,406 p=31411 u=zuul n=ansible | TASK [run_hook : Ensure artifacts directory exists path={{ cifmw_basedir }}/artifacts, state=directory, mode=0755] *** 2026-01-22 16:15:11,407 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:11 +0000 (0:00:00.189) 0:01:01.510 ****** 2026-01-22 16:15:11,407 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:11 +0000 (0:00:00.189) 0:01:01.509 ****** 2026-01-22 16:15:11,594 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:15:11,608 p=31411 u=zuul n=ansible | TASK [run_hook : Run hook without retry - Download needed tools] *************** 2026-01-22 16:15:11,609 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:11 +0000 (0:00:00.202) 0:01:01.713 ****** 2026-01-22 16:15:11,609 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:11 +0000 (0:00:00.202) 0:01:01.711 ****** 2026-01-22 16:15:11,663 p=31411 u=zuul n=ansible | Follow script's output here: /home/zuul/ci-framework-data/logs/ci_script_000_run_hook_without_retry.log 2026-01-22 16:15:46,482 p=31411 u=zuul n=ansible | changed: [localhost] 2026-01-22 16:15:46,497 p=31411 u=zuul n=ansible | TASK [run_hook : Run hook with retry - Download needed tools] ****************** 2026-01-22 16:15:46,497 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:46 +0000 (0:00:34.888) 0:01:36.601 ****** 2026-01-22 16:15:46,497 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:46 +0000 
(0:00:34.888) 0:01:36.600 ****** 2026-01-22 16:15:46,516 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:15:46,527 p=31411 u=zuul n=ansible | TASK [run_hook : Check if we have a file path={{ cifmw_basedir }}/artifacts/{{ step }}_{{ hook_name }}.yml] *** 2026-01-22 16:15:46,527 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:46 +0000 (0:00:00.029) 0:01:36.631 ****** 2026-01-22 16:15:46,527 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:46 +0000 (0:00:00.029) 0:01:36.629 ****** 2026-01-22 16:15:46,692 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:15:46,702 p=31411 u=zuul n=ansible | TASK [run_hook : Load generated content in main playbook file={{ cifmw_basedir }}/artifacts/{{ step }}_{{ hook_name }}.yml] *** 2026-01-22 16:15:46,702 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:46 +0000 (0:00:00.175) 0:01:36.806 ****** 2026-01-22 16:15:46,702 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:46 +0000 (0:00:00.175) 0:01:36.805 ****** 2026-01-22 16:15:46,717 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:15:46,760 p=31411 u=zuul n=ansible | PLAY [Prepare host virtualization] ********************************************* 2026-01-22 16:15:46,781 p=31411 u=zuul n=ansible | TASK [cifmw_setup : Load parameters files dir={{ cifmw_basedir }}/artifacts/parameters] *** 2026-01-22 16:15:46,781 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:46 +0000 (0:00:00.079) 0:01:36.885 ****** 2026-01-22 16:15:46,782 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:46 +0000 (0:00:00.079) 0:01:36.884 ****** 2026-01-22 16:15:46,821 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:15:46,830 p=31411 u=zuul n=ansible | TASK [Ensure libvirt is present/configured name=libvirt_manager] *************** 2026-01-22 16:15:46,830 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:46 +0000 (0:00:00.048) 0:01:36.934 ****** 2026-01-22 16:15:46,830 p=31411 u=zuul n=ansible | Thursday 22 
January 2026 16:15:46 +0000 (0:00:00.048) 0:01:36.932 ****** 2026-01-22 16:15:46,853 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:15:46,861 p=31411 u=zuul n=ansible | TASK [Prepare OpenShift provisioner node name=openshift_provisioner_node] ****** 2026-01-22 16:15:46,861 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:46 +0000 (0:00:00.031) 0:01:36.965 ****** 2026-01-22 16:15:46,861 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:46 +0000 (0:00:00.031) 0:01:36.964 ****** 2026-01-22 16:15:46,883 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:15:46,914 p=31411 u=zuul n=ansible | PLAY [Run cifmw_setup infra, build package, container and operators, deploy EDPM] *** 2026-01-22 16:15:46,948 p=31411 u=zuul n=ansible | TASK [cifmw_setup : Load parameters files dir={{ cifmw_basedir }}/artifacts/parameters] *** 2026-01-22 16:15:46,948 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:46 +0000 (0:00:00.087) 0:01:37.052 ****** 2026-01-22 16:15:46,948 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:46 +0000 (0:00:00.087) 0:01:37.051 ****** 2026-01-22 16:15:46,997 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:15:47,006 p=31411 u=zuul n=ansible | TASK [networking_mapper : Check for Networking Environment Definition file existence path={{ cifmw_networking_mapper_networking_env_def_path }}] *** 2026-01-22 16:15:47,006 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:47 +0000 (0:00:00.058) 0:01:37.110 ****** 2026-01-22 16:15:47,006 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:47 +0000 (0:00:00.058) 0:01:37.109 ****** 2026-01-22 16:15:47,185 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:15:47,201 p=31411 u=zuul n=ansible | TASK [networking_mapper : Check for Networking Definition file existence that=['_net_env_def_stat.stat.exists'], msg=Ensure that the Networking Environment Definition file exists in {{ cifmw_networking_mapper_networking_env_def_path }}, quiet=True] 
*** 2026-01-22 16:15:47,201 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:47 +0000 (0:00:00.194) 0:01:37.305 ****** 2026-01-22 16:15:47,201 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:47 +0000 (0:00:00.194) 0:01:37.304 ****** 2026-01-22 16:15:47,224 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:15:47,232 p=31411 u=zuul n=ansible | TASK [networking_mapper : Load the Networking Definition from file path={{ cifmw_networking_mapper_networking_env_def_path }}] *** 2026-01-22 16:15:47,232 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:47 +0000 (0:00:00.031) 0:01:37.336 ****** 2026-01-22 16:15:47,232 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:47 +0000 (0:00:00.031) 0:01:37.335 ****** 2026-01-22 16:15:47,265 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:15:47,273 p=31411 u=zuul n=ansible | TASK [networking_mapper : Set cifmw_networking_env_definition is present cifmw_networking_env_definition={{ _net_env_def_slurp['content'] | b64decode | from_yaml }}, cacheable=True] *** 2026-01-22 16:15:47,273 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:47 +0000 (0:00:00.041) 0:01:37.377 ****** 2026-01-22 16:15:47,273 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:47 +0000 (0:00:00.041) 0:01:37.376 ****** 2026-01-22 16:15:47,302 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:15:47,316 p=31411 u=zuul n=ansible | TASK [Deploy OCP using Hive name=hive] ***************************************** 2026-01-22 16:15:47,316 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:47 +0000 (0:00:00.042) 0:01:37.420 ****** 2026-01-22 16:15:47,316 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:47 +0000 (0:00:00.042) 0:01:37.418 ****** 2026-01-22 16:15:47,339 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:15:47,347 p=31411 u=zuul n=ansible | TASK [Prepare CRC name=rhol_crc] *********************************************** 2026-01-22 16:15:47,347 
p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:47 +0000 (0:00:00.030) 0:01:37.451 ****** 2026-01-22 16:15:47,347 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:47 +0000 (0:00:00.030) 0:01:37.449 ****** 2026-01-22 16:15:47,374 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:15:47,381 p=31411 u=zuul n=ansible | TASK [Deploy OpenShift cluster using dev-scripts name=devscripts] ************** 2026-01-22 16:15:47,381 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:47 +0000 (0:00:00.034) 0:01:37.485 ****** 2026-01-22 16:15:47,381 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:47 +0000 (0:00:00.034) 0:01:37.484 ****** 2026-01-22 16:15:47,405 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:15:47,414 p=31411 u=zuul n=ansible | TASK [openshift_login : Ensure output directory exists path={{ cifmw_openshift_login_basedir }}/artifacts, state=directory, mode=0755] *** 2026-01-22 16:15:47,414 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:47 +0000 (0:00:00.032) 0:01:37.518 ****** 2026-01-22 16:15:47,414 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:47 +0000 (0:00:00.032) 0:01:37.516 ****** 2026-01-22 16:15:47,612 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:15:47,620 p=31411 u=zuul n=ansible | TASK [openshift_login : OpenShift login _raw_params=login.yml] ***************** 2026-01-22 16:15:47,620 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:47 +0000 (0:00:00.206) 0:01:37.724 ****** 2026-01-22 16:15:47,620 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:47 +0000 (0:00:00.206) 0:01:37.722 ****** 2026-01-22 16:15:47,664 p=31411 u=zuul n=ansible | included: /home/zuul/src/github.com/openstack-k8s-operators/ci-framework/roles/openshift_login/tasks/login.yml for localhost 2026-01-22 16:15:47,680 p=31411 u=zuul n=ansible | TASK [openshift_login : Check if the password file is present path={{ cifmw_openshift_login_password_file | 
default(cifmw_openshift_password_file) }}] *** 2026-01-22 16:15:47,680 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:47 +0000 (0:00:00.060) 0:01:37.784 ****** 2026-01-22 16:15:47,680 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:47 +0000 (0:00:00.060) 0:01:37.783 ****** 2026-01-22 16:15:47,701 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:15:47,712 p=31411 u=zuul n=ansible | TASK [openshift_login : Fetch user password content src={{ cifmw_openshift_login_password_file | default(cifmw_openshift_password_file) }}] *** 2026-01-22 16:15:47,712 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:47 +0000 (0:00:00.031) 0:01:37.816 ****** 2026-01-22 16:15:47,712 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:47 +0000 (0:00:00.031) 0:01:37.815 ****** 2026-01-22 16:15:47,739 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:15:47,750 p=31411 u=zuul n=ansible | TASK [openshift_login : Set user password as a fact cifmw_openshift_login_password={{ cifmw_openshift_login_password_file_slurp.content | b64decode }}, cacheable=True] *** 2026-01-22 16:15:47,750 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:47 +0000 (0:00:00.038) 0:01:37.854 ****** 2026-01-22 16:15:47,750 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:47 +0000 (0:00:00.038) 0:01:37.853 ****** 2026-01-22 16:15:47,779 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:15:47,790 p=31411 u=zuul n=ansible | TASK [openshift_login : Set role variables cifmw_openshift_login_kubeconfig={{ cifmw_openshift_login_kubeconfig | default(cifmw_openshift_kubeconfig) | default( ansible_env.KUBECONFIG if 'KUBECONFIG' in ansible_env else cifmw_openshift_login_kubeconfig_default_path ) | trim }}, cifmw_openshift_login_user={{ cifmw_openshift_login_user | default(cifmw_openshift_user) | default(omit) }}, cifmw_openshift_login_password={{********** cifmw_openshift_login_password | default(cifmw_openshift_password) | default(omit) }}, 
cifmw_openshift_login_api={{ cifmw_openshift_login_api | default(cifmw_openshift_api) | default(omit) }}, cifmw_openshift_login_cert_login={{ cifmw_openshift_login_cert_login | default(false)}}, cifmw_openshift_login_provided_token={{ cifmw_openshift_provided_token | default(omit) }}, cacheable=True] *** 2026-01-22 16:15:47,791 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:47 +0000 (0:00:00.040) 0:01:37.894 ****** 2026-01-22 16:15:47,791 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:47 +0000 (0:00:00.040) 0:01:37.893 ****** 2026-01-22 16:15:47,829 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:15:47,840 p=31411 u=zuul n=ansible | TASK [openshift_login : Check if kubeconfig exists path={{ cifmw_openshift_login_kubeconfig }}] *** 2026-01-22 16:15:47,840 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:47 +0000 (0:00:00.049) 0:01:37.944 ****** 2026-01-22 16:15:47,840 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:47 +0000 (0:00:00.049) 0:01:37.943 ****** 2026-01-22 16:15:48,009 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:15:48,017 p=31411 u=zuul n=ansible | TASK [openshift_login : Assert that enough data is provided to log in to OpenShift that=cifmw_openshift_login_kubeconfig_stat.stat.exists or (cifmw_openshift_login_provided_token is defined and cifmw_openshift_login_provided_token != '') or ( (cifmw_openshift_login_user is defined) and (cifmw_openshift_login_password is defined) and (cifmw_openshift_login_api is defined) ), msg=If an existing kubeconfig is not provided user/pwd or provided/initial token and API URL must be given] *** 2026-01-22 16:15:48,017 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:48 +0000 (0:00:00.177) 0:01:38.121 ****** 2026-01-22 16:15:48,017 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:48 +0000 (0:00:00.177) 0:01:38.120 ****** 2026-01-22 16:15:48,040 p=31411 u=zuul n=ansible | ok: [localhost] => changed: false msg: All assertions passed 2026-01-22 
16:15:48,048 p=31411 u=zuul n=ansible | TASK [openshift_login : Fetch kubeconfig content src={{ cifmw_openshift_login_kubeconfig }}] *** 2026-01-22 16:15:48,048 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:48 +0000 (0:00:00.030) 0:01:38.152 ****** 2026-01-22 16:15:48,048 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:48 +0000 (0:00:00.030) 0:01:38.150 ****** 2026-01-22 16:15:48,066 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:15:48,074 p=31411 u=zuul n=ansible | TASK [openshift_login : Fetch x509 key based users cifmw_openshift_login_key_based_users={{ ( cifmw_openshift_login_kubeconfig_content_b64.content | b64decode | from_yaml ). users | default([]) | selectattr('user.client-certificate-data', 'defined') | map(attribute="name") | map("split", "/") | map("first") }}, cacheable=True] *** 2026-01-22 16:15:48,074 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:48 +0000 (0:00:00.026) 0:01:38.178 ****** 2026-01-22 16:15:48,074 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:48 +0000 (0:00:00.026) 0:01:38.177 ****** 2026-01-22 16:15:48,095 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:15:48,105 p=31411 u=zuul n=ansible | TASK [openshift_login : Assign key based user if not provided and available cifmw_openshift_login_user={{ (cifmw_openshift_login_assume_cert_system_user | ternary('system:', '')) + (cifmw_openshift_login_key_based_users | map('replace', 'system:', '') | unique | first) }}, cifmw_openshift_login_cert_login=True, cacheable=True] *** 2026-01-22 16:15:48,106 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:48 +0000 (0:00:00.031) 0:01:38.209 ****** 2026-01-22 16:15:48,106 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:48 +0000 (0:00:00.031) 0:01:38.208 ****** 2026-01-22 16:15:48,130 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:15:48,141 p=31411 u=zuul n=ansible | TASK [openshift_login : Set the retry count cifmw_openshift_login_retries_cnt={{ 
0 if cifmw_openshift_login_retries_cnt is undefined else cifmw_openshift_login_retries_cnt|int + 1 }}] *** 2026-01-22 16:15:48,141 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:48 +0000 (0:00:00.035) 0:01:38.245 ****** 2026-01-22 16:15:48,141 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:48 +0000 (0:00:00.035) 0:01:38.244 ****** 2026-01-22 16:15:48,164 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:15:48,174 p=31411 u=zuul n=ansible | TASK [openshift_login : Fetch token _raw_params=try_login.yml] ***************** 2026-01-22 16:15:48,174 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:48 +0000 (0:00:00.033) 0:01:38.278 ****** 2026-01-22 16:15:48,175 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:48 +0000 (0:00:00.033) 0:01:38.277 ****** 2026-01-22 16:15:48,202 p=31411 u=zuul n=ansible | included: /home/zuul/src/github.com/openstack-k8s-operators/ci-framework/roles/openshift_login/tasks/try_login.yml for localhost 2026-01-22 16:15:48,217 p=31411 u=zuul n=ansible | TASK [openshift_login : Try get OpenShift access token _raw_params=oc whoami -t] *** 2026-01-22 16:15:48,218 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:48 +0000 (0:00:00.043) 0:01:38.322 ****** 2026-01-22 16:15:48,218 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:48 +0000 (0:00:00.043) 0:01:38.320 ****** 2026-01-22 16:15:48,236 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:15:48,247 p=31411 u=zuul n=ansible | TASK [openshift_login : Fetch OpenShift token output_dir={{ cifmw_openshift_login_basedir }}/artifacts, script=oc login {%- if cifmw_openshift_login_provided_token is not defined %} {%- if cifmw_openshift_login_user is defined %} -u {{ cifmw_openshift_login_user }} {%- endif %} {%- if cifmw_openshift_login_password is defined %} -p {{ cifmw_openshift_login_password }} {%- endif %} {% else %} --token={{ cifmw_openshift_login_provided_token }} {%- endif %} {%- if cifmw_openshift_login_skip_tls_verify|bool 
%} --insecure-skip-tls-verify=true {%- endif %} {%- if cifmw_openshift_login_api is defined %} {{ cifmw_openshift_login_api }} {%- endif %}] *** 2026-01-22 16:15:48,247 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:48 +0000 (0:00:00.029) 0:01:38.351 ****** 2026-01-22 16:15:48,247 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:48 +0000 (0:00:00.029) 0:01:38.349 ****** 2026-01-22 16:15:48,308 p=31411 u=zuul n=ansible | Follow script's output here: /home/zuul/ci-framework-data/logs/ci_script_001_fetch_openshift.log 2026-01-22 16:15:48,679 p=31411 u=zuul n=ansible | changed: [localhost] 2026-01-22 16:15:48,688 p=31411 u=zuul n=ansible | TASK [openshift_login : Ensure kubeconfig is provided that=cifmw_openshift_login_kubeconfig != ""] *** 2026-01-22 16:15:48,688 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:48 +0000 (0:00:00.441) 0:01:38.792 ****** 2026-01-22 16:15:48,688 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:48 +0000 (0:00:00.441) 0:01:38.790 ****** 2026-01-22 16:15:48,717 p=31411 u=zuul n=ansible | ok: [localhost] => changed: false msg: All assertions passed 2026-01-22 16:15:48,725 p=31411 u=zuul n=ansible | TASK [openshift_login : Fetch new OpenShift access token _raw_params=oc whoami -t] *** 2026-01-22 16:15:48,725 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:48 +0000 (0:00:00.037) 0:01:38.829 ****** 2026-01-22 16:15:48,725 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:48 +0000 (0:00:00.037) 0:01:38.827 ****** 2026-01-22 16:15:49,036 p=31411 u=zuul n=ansible | changed: [localhost] 2026-01-22 16:15:49,044 p=31411 u=zuul n=ansible | TASK [openshift_login : Set new OpenShift token cifmw_openshift_login_token={{ (not cifmw_openshift_login_new_token_out.skipped | default(false)) | ternary(cifmw_openshift_login_new_token_out.stdout, cifmw_openshift_login_whoami_out.stdout) }}, cacheable=True] *** 2026-01-22 16:15:49,044 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:49 +0000 
(0:00:00.319) 0:01:39.148 ****** 2026-01-22 16:15:49,044 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:49 +0000 (0:00:00.319) 0:01:39.147 ****** 2026-01-22 16:15:49,076 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:15:49,083 p=31411 u=zuul n=ansible | TASK [openshift_login : Fetch OpenShift API URL _raw_params=oc whoami --show-server=true] *** 2026-01-22 16:15:49,084 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:49 +0000 (0:00:00.039) 0:01:39.187 ****** 2026-01-22 16:15:49,084 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:49 +0000 (0:00:00.039) 0:01:39.186 ****** 2026-01-22 16:15:49,354 p=31411 u=zuul n=ansible | changed: [localhost] 2026-01-22 16:15:49,361 p=31411 u=zuul n=ansible | TASK [openshift_login : Fetch OpenShift kubeconfig context _raw_params=oc whoami -c] *** 2026-01-22 16:15:49,362 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:49 +0000 (0:00:00.277) 0:01:39.465 ****** 2026-01-22 16:15:49,362 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:49 +0000 (0:00:00.277) 0:01:39.464 ****** 2026-01-22 16:15:49,615 p=31411 u=zuul n=ansible | changed: [localhost] 2026-01-22 16:15:49,623 p=31411 u=zuul n=ansible | TASK [openshift_login : Fetch OpenShift current user _raw_params=oc whoami] **** 2026-01-22 16:15:49,623 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:49 +0000 (0:00:00.261) 0:01:39.727 ****** 2026-01-22 16:15:49,623 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:49 +0000 (0:00:00.261) 0:01:39.726 ****** 2026-01-22 16:15:49,924 p=31411 u=zuul n=ansible | changed: [localhost] 2026-01-22 16:15:49,934 p=31411 u=zuul n=ansible | TASK [openshift_login : Set OpenShift user, context and API facts cifmw_openshift_login_api={{ cifmw_openshift_login_api_out.stdout }}, cifmw_openshift_login_context={{ cifmw_openshift_login_context_out.stdout }}, cifmw_openshift_login_user={{ _oauth_user }}, cifmw_openshift_kubeconfig={{ cifmw_openshift_login_kubeconfig }}, 
cifmw_openshift_api={{ cifmw_openshift_login_api_out.stdout }}, cifmw_openshift_context={{ cifmw_openshift_login_context_out.stdout }}, cifmw_openshift_user={{ _oauth_user }}, cifmw_openshift_token={{ cifmw_openshift_login_token | default(omit) }}, cifmw_install_yamls_environment={{ ( cifmw_install_yamls_environment | combine({'KUBECONFIG': cifmw_openshift_login_kubeconfig}) ) if cifmw_install_yamls_environment is defined else omit }}, cacheable=True] *** 2026-01-22 16:15:49,934 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:49 +0000 (0:00:00.311) 0:01:40.038 ****** 2026-01-22 16:15:49,934 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:49 +0000 (0:00:00.311) 0:01:40.037 ****** 2026-01-22 16:15:49,968 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:15:49,975 p=31411 u=zuul n=ansible | TASK [openshift_login : Create the openshift_login parameters file dest={{ cifmw_basedir }}/artifacts/parameters/openshift-login-params.yml, content={{ cifmw_openshift_login_params_content | from_yaml | to_nice_yaml }}, mode=0600] *** 2026-01-22 16:15:49,976 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:49 +0000 (0:00:00.041) 0:01:40.079 ****** 2026-01-22 16:15:49,976 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:49 +0000 (0:00:00.041) 0:01:40.078 ****** 2026-01-22 16:15:50,366 p=31411 u=zuul n=ansible | changed: [localhost] 2026-01-22 16:15:50,375 p=31411 u=zuul n=ansible | TASK [openshift_login : Read the install yamls parameters file path={{ cifmw_basedir }}/artifacts/parameters/install-yamls-params.yml] *** 2026-01-22 16:15:50,376 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:50 +0000 (0:00:00.399) 0:01:40.479 ****** 2026-01-22 16:15:50,376 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:50 +0000 (0:00:00.399) 0:01:40.478 ****** 2026-01-22 16:15:50,625 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:15:50,636 p=31411 u=zuul n=ansible | TASK [openshift_login : Append the KUBECONFIG to the install 
yamls parameters content={{ cifmw_openshift_login_install_yamls_artifacts_slurp['content'] | b64decode | from_yaml | combine( { 'cifmw_install_yamls_environment': { 'KUBECONFIG': cifmw_openshift_login_kubeconfig } }, recursive=true) | to_nice_yaml }}, dest={{ cifmw_basedir }}/artifacts/parameters/install-yamls-params.yml, mode=0600] *** 2026-01-22 16:15:50,637 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:50 +0000 (0:00:00.260) 0:01:40.740 ****** 2026-01-22 16:15:50,637 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:50 +0000 (0:00:00.261) 0:01:40.739 ****** 2026-01-22 16:15:51,138 p=31411 u=zuul n=ansible | changed: [localhost] 2026-01-22 16:15:51,152 p=31411 u=zuul n=ansible | TASK [openshift_setup : Ensure output directory exists path={{ cifmw_openshift_setup_basedir }}/artifacts, state=directory, mode=0755] *** 2026-01-22 16:15:51,152 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:51 +0000 (0:00:00.515) 0:01:41.256 ****** 2026-01-22 16:15:51,152 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:51 +0000 (0:00:00.515) 0:01:41.255 ****** 2026-01-22 16:15:51,363 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:15:51,371 p=31411 u=zuul n=ansible | TASK [openshift_setup : Fetch namespaces to create cifmw_openshift_setup_namespaces={{ (( ([cifmw_install_yamls_defaults['NAMESPACE']] + ([cifmw_install_yamls_defaults['OPERATOR_NAMESPACE']] if 'OPERATOR_NAMESPACE' is in cifmw_install_yamls_defaults else []) ) if cifmw_install_yamls_defaults is defined else [] ) + cifmw_openshift_setup_create_namespaces) | unique }}] *** 2026-01-22 16:15:51,372 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:51 +0000 (0:00:00.219) 0:01:41.476 ****** 2026-01-22 16:15:51,372 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:51 +0000 (0:00:00.219) 0:01:41.474 ****** 2026-01-22 16:15:51,416 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:15:51,435 p=31411 u=zuul n=ansible | TASK [openshift_setup : Create required 
namespaces kubeconfig={{ cifmw_openshift_kubeconfig }}, api_key={{ cifmw_openshift_token | default(omit)}}, context={{ cifmw_openshift_context | default(omit) }}, name={{ item }}, kind=Namespace, state=present] *** 2026-01-22 16:15:51,435 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:51 +0000 (0:00:00.063) 0:01:41.539 ****** 2026-01-22 16:15:51,435 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:51 +0000 (0:00:00.063) 0:01:41.537 ****** 2026-01-22 16:15:52,382 p=31411 u=zuul n=ansible | changed: [localhost] => (item=openstack) 2026-01-22 16:15:53,180 p=31411 u=zuul n=ansible | changed: [localhost] => (item=openstack-operators) 2026-01-22 16:15:53,192 p=31411 u=zuul n=ansible | TASK [openshift_setup : Get internal OpenShift registry route kubeconfig={{ cifmw_openshift_kubeconfig }}, api_key={{ cifmw_openshift_token | default(omit)}}, context={{ cifmw_openshift_context | default(omit)}}, kind=Route, name=default-route, namespace=openshift-image-registry] *** 2026-01-22 16:15:53,192 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:53 +0000 (0:00:01.756) 0:01:43.296 ****** 2026-01-22 16:15:53,192 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:53 +0000 (0:00:01.756) 0:01:43.294 ****** 2026-01-22 16:15:53,206 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:15:53,214 p=31411 u=zuul n=ansible | TASK [openshift_setup : Allow anonymous image-pulls in CRC registry for targeted namespaces state=present, kubeconfig={{ cifmw_openshift_kubeconfig }}, api_key={{ cifmw_openshift_token | default(omit)}}, context={{ cifmw_openshift_context | default(omit)}}, definition={'kind': 'RoleBinding', 'apiVersion': 'rbac.authorization.k8s.io/v1', 'metadata': {'name': 'system:image-puller', 'namespace': '{{ item }}'}, 'subjects': [{'kind': 'User', 'name': 'system:anonymous'}, {'kind': 'User', 'name': 'system:unauthenticated'}], 'roleRef': {'kind': 'ClusterRole', 'name': 'system:image-puller'}}] *** 2026-01-22 16:15:53,215 p=31411 u=zuul 
n=ansible | Thursday 22 January 2026 16:15:53 +0000 (0:00:00.022) 0:01:43.318 ****** 2026-01-22 16:15:53,215 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:53 +0000 (0:00:00.022) 0:01:43.317 ****** 2026-01-22 16:15:53,234 p=31411 u=zuul n=ansible | skipping: [localhost] => (item=openstack) 2026-01-22 16:15:53,235 p=31411 u=zuul n=ansible | skipping: [localhost] => (item=openstack-operators) 2026-01-22 16:15:53,236 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:15:53,243 p=31411 u=zuul n=ansible | TASK [openshift_setup : Wait for the image registry to be ready kind=Deployment, name=image-registry, namespace=openshift-image-registry, kubeconfig={{ cifmw_openshift_kubeconfig }}, api_key={{ cifmw_openshift_token | default(omit)}}, context={{ cifmw_openshift_context | default(omit)}}, wait=True, wait_sleep=10, wait_timeout=600, wait_condition={'type': 'Available', 'status': 'True'}] *** 2026-01-22 16:15:53,243 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:53 +0000 (0:00:00.028) 0:01:43.347 ****** 2026-01-22 16:15:53,243 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:53 +0000 (0:00:00.028) 0:01:43.346 ****** 2026-01-22 16:15:53,264 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:15:53,272 p=31411 u=zuul n=ansible | TASK [openshift_setup : Login into OpenShift internal registry output_dir={{ cifmw_openshift_setup_basedir }}/artifacts, script=podman login -u {{ cifmw_openshift_user }} -p {{ cifmw_openshift_token }} {%- if cifmw_openshift_setup_skip_internal_registry_tls_verify|bool %} --tls-verify=false {%- endif %} {{ cifmw_openshift_setup_registry_default_route.resources[0].spec.host }}] *** 2026-01-22 16:15:53,272 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:53 +0000 (0:00:00.028) 0:01:43.376 ****** 2026-01-22 16:15:53,272 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:53 +0000 (0:00:00.028) 0:01:43.374 ****** 2026-01-22 16:15:53,291 p=31411 u=zuul n=ansible | skipping: [localhost] 
2026-01-22 16:15:53,299 p=31411 u=zuul n=ansible | TASK [Ensure we have custom CA installed on host role=install_ca] ************** 2026-01-22 16:15:53,299 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:53 +0000 (0:00:00.027) 0:01:43.403 ****** 2026-01-22 16:15:53,299 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:53 +0000 (0:00:00.027) 0:01:43.402 ****** 2026-01-22 16:15:53,317 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:15:53,326 p=31411 u=zuul n=ansible | TASK [openshift_setup : Update ca bundle _raw_params=update-ca-trust extract] *** 2026-01-22 16:15:53,326 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:53 +0000 (0:00:00.026) 0:01:43.430 ****** 2026-01-22 16:15:53,326 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:53 +0000 (0:00:00.026) 0:01:43.429 ****** 2026-01-22 16:15:53,344 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:15:53,352 p=31411 u=zuul n=ansible | TASK [openshift_setup : Slurp CAs file src={{ cifmw_openshift_setup_ca_bundle_path }}] *** 2026-01-22 16:15:53,352 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:53 +0000 (0:00:00.026) 0:01:43.456 ****** 2026-01-22 16:15:53,352 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:53 +0000 (0:00:00.026) 0:01:43.455 ****** 2026-01-22 16:15:53,370 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:15:53,379 p=31411 u=zuul n=ansible | TASK [openshift_setup : Create config map with registry CAs kubeconfig={{ cifmw_openshift_kubeconfig }}, api_key={{ cifmw_openshift_token | default(omit)}}, context={{ cifmw_openshift_context | default(omit)}}, definition={'apiVersion': 'v1', 'kind': 'ConfigMap', 'metadata': {'namespace': 'openshift-config', 'name': 'registry-cas'}, 'data': '{{ _config_map_data | items2dict }}'}] *** 2026-01-22 16:15:53,379 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:53 +0000 (0:00:00.026) 0:01:43.483 ****** 2026-01-22 16:15:53,379 p=31411 u=zuul n=ansible | Thursday 22 
January 2026 16:15:53 +0000 (0:00:00.026) 0:01:43.481 ****** 2026-01-22 16:15:53,397 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:15:53,407 p=31411 u=zuul n=ansible | TASK [openshift_setup : Install Red Hat CA for pulling images from internal registry kubeconfig={{ cifmw_openshift_kubeconfig }}, api_key={{ cifmw_openshift_token | default(omit)}}, context={{ cifmw_openshift_context | default(omit)}}, merge_type=merge, definition={'apiVersion': 'config.openshift.io/v1', 'kind': 'Image', 'metadata': {'name': 'cluster'}, 'spec': {'additionalTrustedCA': {'name': 'registry-cas'}}}] *** 2026-01-22 16:15:53,407 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:53 +0000 (0:00:00.027) 0:01:43.511 ****** 2026-01-22 16:15:53,407 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:53 +0000 (0:00:00.028) 0:01:43.509 ****** 2026-01-22 16:15:53,425 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:15:53,433 p=31411 u=zuul n=ansible | TASK [openshift_setup : Add insecure registry kubeconfig={{ cifmw_openshift_kubeconfig }}, api_key={{ cifmw_openshift_token | default(omit)}}, context={{ cifmw_openshift_context | default(omit)}}, merge_type=merge, definition={'apiVersion': 'config.openshift.io/v1', 'kind': 'Image', 'metadata': {'name': 'cluster'}, 'spec': {'registrySources': {'insecureRegistries': ['{{ cifmw_update_containers_registry }}'], 'allowedRegistries': '{{ all_registries }}'}}}] *** 2026-01-22 16:15:53,433 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:53 +0000 (0:00:00.026) 0:01:43.537 ****** 2026-01-22 16:15:53,433 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:53 +0000 (0:00:00.026) 0:01:43.536 ****** 2026-01-22 16:15:53,451 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:15:53,459 p=31411 u=zuul n=ansible | TASK [openshift_setup : Create a ICSP with repository digest mirrors kubeconfig={{ cifmw_openshift_kubeconfig }}, api_key={{ cifmw_openshift_token | default(omit)}}, context={{ 
cifmw_openshift_context | default(omit)}}, definition={'apiVersion': 'operator.openshift.io/v1alpha1', 'kind': 'ImageContentSourcePolicy', 'metadata': {'name': 'registry-digest-mirrors'}, 'spec': {'repositoryDigestMirrors': '{{ cifmw_openshift_setup_digest_mirrors }}'}}] *** 2026-01-22 16:15:53,460 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:53 +0000 (0:00:00.026) 0:01:43.563 ****** 2026-01-22 16:15:53,460 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:53 +0000 (0:00:00.026) 0:01:43.562 ****** 2026-01-22 16:15:53,481 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:15:53,489 p=31411 u=zuul n=ansible | TASK [openshift_setup : Gather network.operator info kubeconfig={{ cifmw_openshift_kubeconfig }}, api_key={{ cifmw_openshift_token | default(omit)}}, context={{ cifmw_openshift_context | default(omit)}}, api_version=operator.openshift.io/v1, kind=Network, name=cluster] *** 2026-01-22 16:15:53,489 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:53 +0000 (0:00:00.029) 0:01:43.593 ****** 2026-01-22 16:15:53,489 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:53 +0000 (0:00:00.029) 0:01:43.591 ****** 2026-01-22 16:15:54,353 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:15:54,364 p=31411 u=zuul n=ansible | TASK [openshift_setup : Patch network operator api_version=operator.openshift.io/v1, kubeconfig={{ cifmw_openshift_kubeconfig }}, kind=Network, name=cluster, persist_config=True, patch=[{'path': '/spec/defaultNetwork/ovnKubernetesConfig/gatewayConfig/routingViaHost', 'value': True, 'op': 'replace'}, {'path': '/spec/defaultNetwork/ovnKubernetesConfig/gatewayConfig/ipForwarding', 'value': 'Global', 'op': 'replace'}]] *** 2026-01-22 16:15:54,364 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:54 +0000 (0:00:00.875) 0:01:44.468 ****** 2026-01-22 16:15:54,364 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:54 +0000 (0:00:00.875) 0:01:44.467 ****** 2026-01-22 16:15:55,282 p=31411 u=zuul 
n=ansible | changed: [localhost] 2026-01-22 16:15:55,300 p=31411 u=zuul n=ansible | TASK [openshift_setup : Patch samples registry configuration kubeconfig={{ cifmw_openshift_kubeconfig }}, api_key={{ cifmw_openshift_token | default(omit)}}, context={{ cifmw_openshift_context | default(omit)}}, api_version=samples.operator.openshift.io/v1, kind=Config, name=cluster, patch=[{'op': 'replace', 'path': '/spec/samplesRegistry', 'value': 'registry.redhat.io'}]] *** 2026-01-22 16:15:55,300 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:55 +0000 (0:00:00.935) 0:01:45.404 ****** 2026-01-22 16:15:55,300 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:55 +0000 (0:00:00.935) 0:01:45.403 ****** 2026-01-22 16:15:56,032 p=31411 u=zuul n=ansible | changed: [localhost] 2026-01-22 16:15:56,049 p=31411 u=zuul n=ansible | TASK [openshift_setup : Delete the pods from openshift-marketplace namespace kind=Pod, state=absent, delete_all=True, kubeconfig={{ cifmw_openshift_kubeconfig }}, namespace=openshift-marketplace] *** 2026-01-22 16:15:56,049 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:56 +0000 (0:00:00.748) 0:01:46.153 ****** 2026-01-22 16:15:56,049 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:56 +0000 (0:00:00.748) 0:01:46.152 ****** 2026-01-22 16:15:56,069 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:15:56,085 p=31411 u=zuul n=ansible | TASK [openshift_setup : Wait for openshift-marketplace pods to be running _raw_params=oc wait pod --all --for=condition=Ready -n openshift-marketplace --timeout=1m] *** 2026-01-22 16:15:56,086 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:56 +0000 (0:00:00.036) 0:01:46.189 ****** 2026-01-22 16:15:56,086 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:56 +0000 (0:00:00.036) 0:01:46.188 ****** 2026-01-22 16:15:56,104 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:15:56,123 p=31411 u=zuul n=ansible | TASK [Deploy Observability operator. 
name=openshift_obs] *********************** 2026-01-22 16:15:56,123 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:56 +0000 (0:00:00.037) 0:01:46.227 ****** 2026-01-22 16:15:56,123 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:56 +0000 (0:00:00.037) 0:01:46.226 ****** 2026-01-22 16:15:56,145 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:15:56,155 p=31411 u=zuul n=ansible | TASK [Deploy Metal3 BMHs name=deploy_bmh] ************************************** 2026-01-22 16:15:56,155 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:56 +0000 (0:00:00.032) 0:01:46.259 ****** 2026-01-22 16:15:56,155 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:56 +0000 (0:00:00.032) 0:01:46.258 ****** 2026-01-22 16:15:56,179 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:15:56,189 p=31411 u=zuul n=ansible | TASK [Install certmanager operator role name=cert_manager] ********************* 2026-01-22 16:15:56,189 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:56 +0000 (0:00:00.033) 0:01:46.293 ****** 2026-01-22 16:15:56,189 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:56 +0000 (0:00:00.033) 0:01:46.292 ****** 2026-01-22 16:15:56,288 p=31411 u=zuul n=ansible | TASK [cert_manager : Create role needed directories path={{ cifmw_cert_manager_manifests_dir }}, state=directory, mode=0755] *** 2026-01-22 16:15:56,288 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:56 +0000 (0:00:00.098) 0:01:46.392 ****** 2026-01-22 16:15:56,288 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:56 +0000 (0:00:00.098) 0:01:46.391 ****** 2026-01-22 16:15:56,482 p=31411 u=zuul n=ansible | changed: [localhost] 2026-01-22 16:15:56,490 p=31411 u=zuul n=ansible | TASK [cert_manager : Create the cifmw_cert_manager_operator_namespace namespace" kubeconfig={{ cifmw_openshift_kubeconfig }}, api_key={{ cifmw_openshift_token | default(omit)}}, context={{ cifmw_openshift_context | default(omit) }}, name={{ 
cifmw_cert_manager_operator_namespace }}, kind=Namespace, state=present] *** 2026-01-22 16:15:56,490 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:56 +0000 (0:00:00.202) 0:01:46.594 ****** 2026-01-22 16:15:56,490 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:56 +0000 (0:00:00.202) 0:01:46.593 ****** 2026-01-22 16:15:57,178 p=31411 u=zuul n=ansible | changed: [localhost] 2026-01-22 16:15:57,188 p=31411 u=zuul n=ansible | TASK [cert_manager : Install from Release Manifest _raw_params=release_manifest.yml] *** 2026-01-22 16:15:57,189 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:57 +0000 (0:00:00.698) 0:01:47.292 ****** 2026-01-22 16:15:57,189 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:57 +0000 (0:00:00.698) 0:01:47.291 ****** 2026-01-22 16:15:57,221 p=31411 u=zuul n=ansible | included: /home/zuul/src/github.com/openstack-k8s-operators/ci-framework/roles/cert_manager/tasks/release_manifest.yml for localhost 2026-01-22 16:15:57,241 p=31411 u=zuul n=ansible | TASK [cert_manager : Download release manifests url={{ cifmw_cert_manager_release_manifest }}, dest={{ cifmw_cert_manager_manifests_dir }}/cert_manager_manifest.yml, mode=0664] *** 2026-01-22 16:15:57,241 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:57 +0000 (0:00:00.052) 0:01:47.345 ****** 2026-01-22 16:15:57,241 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:57 +0000 (0:00:00.052) 0:01:47.343 ****** 2026-01-22 16:15:57,920 p=31411 u=zuul n=ansible | changed: [localhost] 2026-01-22 16:15:57,936 p=31411 u=zuul n=ansible | TASK [cert_manager : Install cert-manager from release manifest kubeconfig={{ cifmw_openshift_kubeconfig }}, api_key={{ cifmw_openshift_token | default(omit)}}, context={{ cifmw_openshift_context | default(omit) }}, state=present, src={{ cifmw_cert_manager_manifests_dir }}/cert_manager_manifest.yml] *** 2026-01-22 16:15:57,936 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:57 +0000 (0:00:00.695) 0:01:48.040 
****** 2026-01-22 16:15:57,936 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:15:57 +0000 (0:00:00.695) 0:01:48.039 ****** 2026-01-22 16:16:01,405 p=31411 u=zuul n=ansible | changed: [localhost] 2026-01-22 16:16:01,461 p=31411 u=zuul n=ansible | TASK [cert_manager : Install from OLM Manifest _raw_params=olm_manifest.yml] *** 2026-01-22 16:16:01,461 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:01 +0000 (0:00:03.524) 0:01:51.565 ****** 2026-01-22 16:16:01,461 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:01 +0000 (0:00:03.524) 0:01:51.563 ****** 2026-01-22 16:16:01,474 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:16:01,485 p=31411 u=zuul n=ansible | TASK [cert_manager : Check for cert-manager namspeace existance kubeconfig={{ cifmw_openshift_kubeconfig }}, api_key={{ cifmw_openshift_token | default(omit)}}, context={{ cifmw_openshift_context | default(omit) }}, name=cert-manager, kind=Namespace, field_selectors=['status.phase=Active']] *** 2026-01-22 16:16:01,485 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:01 +0000 (0:00:00.023) 0:01:51.589 ****** 2026-01-22 16:16:01,485 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:01 +0000 (0:00:00.023) 0:01:51.587 ****** 2026-01-22 16:16:02,182 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:16:02,191 p=31411 u=zuul n=ansible | TASK [cert_manager : Wait for cert-manager pods to be ready kubeconfig={{ cifmw_openshift_kubeconfig }}, api_key={{ cifmw_openshift_token | default(omit)}}, context={{ cifmw_openshift_context | default(omit) }}, namespace=cert-manager, kind=Pod, wait=True, wait_sleep=10, wait_timeout=600, wait_condition={'type': 'Ready', 'status': 'True'}, label_selectors=['app = {{ item }}']] *** 2026-01-22 16:16:02,191 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:02 +0000 (0:00:00.706) 0:01:52.295 ****** 2026-01-22 16:16:02,191 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:02 +0000 (0:00:00.706) 0:01:52.294 
****** 2026-01-22 16:16:12,989 p=31411 u=zuul n=ansible | ok: [localhost] => (item=cainjector) 2026-01-22 16:16:13,717 p=31411 u=zuul n=ansible | ok: [localhost] => (item=webhook) 2026-01-22 16:16:14,424 p=31411 u=zuul n=ansible | ok: [localhost] => (item=cert-manager) 2026-01-22 16:16:14,458 p=31411 u=zuul n=ansible | TASK [cert_manager : Create $HOME/bin dir path={{ ansible_user_dir }}/bin, state=directory, mode=0755] *** 2026-01-22 16:16:14,458 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:14 +0000 (0:00:12.266) 0:02:04.562 ****** 2026-01-22 16:16:14,458 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:14 +0000 (0:00:12.267) 0:02:04.561 ****** 2026-01-22 16:16:14,638 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:16:14,649 p=31411 u=zuul n=ansible | TASK [cert_manager : Install cert-manager cmctl CLI url=https://github.com/cert-manager/cmctl/releases/{{ cifmw_cert_manager_version }}/download/cmctl_{{ _os }}_{{ _arch }}, dest={{ ansible_user_dir }}/bin/cmctl, mode=0755] *** 2026-01-22 16:16:14,649 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:14 +0000 (0:00:00.190) 0:02:04.753 ****** 2026-01-22 16:16:14,649 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:14 +0000 (0:00:00.190) 0:02:04.751 ****** 2026-01-22 16:16:15,906 p=31411 u=zuul n=ansible | changed: [localhost] 2026-01-22 16:16:15,916 p=31411 u=zuul n=ansible | TASK [cert_manager : Verify cert_manager api _raw_params={{ ansible_user_dir }}/bin/cmctl check api --wait=2m] *** 2026-01-22 16:16:15,916 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:15 +0000 (0:00:01.267) 0:02:06.020 ****** 2026-01-22 16:16:15,916 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:15 +0000 (0:00:01.267) 0:02:06.019 ****** 2026-01-22 16:16:16,257 p=31411 u=zuul n=ansible | changed: [localhost] 2026-01-22 16:16:16,271 p=31411 u=zuul n=ansible | TASK [Configure hosts networking using nmstate name=ci_nmstate] **************** 2026-01-22 16:16:16,271 p=31411 u=zuul 
n=ansible | Thursday 22 January 2026 16:16:16 +0000 (0:00:00.354) 0:02:06.375 ****** 2026-01-22 16:16:16,271 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:16 +0000 (0:00:00.354) 0:02:06.374 ****** 2026-01-22 16:16:16,291 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:16:16,299 p=31411 u=zuul n=ansible | TASK [Configure multus networks name=ci_multus] ******************************** 2026-01-22 16:16:16,299 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:16 +0000 (0:00:00.027) 0:02:06.403 ****** 2026-01-22 16:16:16,299 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:16 +0000 (0:00:00.027) 0:02:06.402 ****** 2026-01-22 16:16:16,314 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:16:16,322 p=31411 u=zuul n=ansible | TASK [Deploy Sushy Emulator service pod name=sushy_emulator] ******************* 2026-01-22 16:16:16,323 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:16 +0000 (0:00:00.023) 0:02:06.426 ****** 2026-01-22 16:16:16,323 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:16 +0000 (0:00:00.023) 0:02:06.425 ****** 2026-01-22 16:16:16,339 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:16:16,347 p=31411 u=zuul n=ansible | TASK [Setup Libvirt on controller name=libvirt_manager] ************************ 2026-01-22 16:16:16,347 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:16 +0000 (0:00:00.024) 0:02:06.451 ****** 2026-01-22 16:16:16,348 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:16 +0000 (0:00:00.024) 0:02:06.450 ****** 2026-01-22 16:16:16,364 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:16:16,372 p=31411 u=zuul n=ansible | TASK [Prepare container package builder name=pkg_build] ************************ 2026-01-22 16:16:16,372 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:16 +0000 (0:00:00.024) 0:02:06.476 ****** 2026-01-22 16:16:16,372 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:16 +0000 
(0:00:00.024) 0:02:06.474 ****** 2026-01-22 16:16:16,394 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:16:16,402 p=31411 u=zuul n=ansible | TASK [run_hook : Assert parameters are valid quiet=True, that=['_list_hooks is not string', '_list_hooks is not mapping', '_list_hooks is iterable', '(hooks | default([])) is not string', '(hooks | default([])) is not mapping', '(hooks | default([])) is iterable']] *** 2026-01-22 16:16:16,403 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:16 +0000 (0:00:00.030) 0:02:06.506 ****** 2026-01-22 16:16:16,403 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:16 +0000 (0:00:00.030) 0:02:06.505 ****** 2026-01-22 16:16:16,457 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:16:16,465 p=31411 u=zuul n=ansible | TASK [run_hook : Assert single hooks are all mappings quiet=True, that=['_not_mapping_hooks | length == 0'], msg=All single hooks must be a list of mappings or a mapping.] *** 2026-01-22 16:16:16,465 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:16 +0000 (0:00:00.062) 0:02:06.569 ****** 2026-01-22 16:16:16,465 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:16 +0000 (0:00:00.062) 0:02:06.567 ****** 2026-01-22 16:16:16,537 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:16:16,549 p=31411 u=zuul n=ansible | TASK [run_hook : Loop on hooks for post_infra _raw_params={{ hook.type }}.yml] *** 2026-01-22 16:16:16,550 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:16 +0000 (0:00:00.084) 0:02:06.654 ****** 2026-01-22 16:16:16,550 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:16 +0000 (0:00:00.084) 0:02:06.652 ****** 2026-01-22 16:16:16,656 p=31411 u=zuul n=ansible | included: /home/zuul/src/github.com/openstack-k8s-operators/ci-framework/roles/run_hook/tasks/playbook.yml for localhost => (item={'name': 'Fetch nodes facts and save them as parameters', 'type': 'playbook', 'inventory': '/home/zuul/ci-framework-data/artifacts/zuul_inventory.yml', 
'source': 'fetch_compute_facts.yml'}) 2026-01-22 16:16:16,670 p=31411 u=zuul n=ansible | TASK [run_hook : Set playbook path for Fetch nodes facts and save them as parameters cifmw_basedir={{ _bdir }}, hook_name={{ _hook_name }}, playbook_path={{ _play | realpath }}, log_path={{ _bdir }}/logs/{{ step }}_{{ _hook_name }}.log, extra_vars=-e namespace={{ cifmw_openstack_namespace }} {%- if hook.extra_vars is defined and hook.extra_vars|length > 0 -%} {% for key,value in hook.extra_vars.items() -%} {%- if key == 'file' %} -e "@{{ value }}" {%- else %} -e "{{ key }}={{ value }}" {%- endif %} {%- endfor %} {%- endif %}] *** 2026-01-22 16:16:16,670 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:16 +0000 (0:00:00.120) 0:02:06.774 ****** 2026-01-22 16:16:16,671 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:16 +0000 (0:00:00.120) 0:02:06.773 ****** 2026-01-22 16:16:16,714 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:16:16,725 p=31411 u=zuul n=ansible | TASK [run_hook : Get file stat path={{ playbook_path }}] *********************** 2026-01-22 16:16:16,725 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:16 +0000 (0:00:00.054) 0:02:06.829 ****** 2026-01-22 16:16:16,725 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:16 +0000 (0:00:00.054) 0:02:06.828 ****** 2026-01-22 16:16:16,927 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:16:16,939 p=31411 u=zuul n=ansible | TASK [run_hook : Fail if playbook doesn't exist msg=Playbook {{ playbook_path }} doesn't seem to exist.] 
*** 2026-01-22 16:16:16,939 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:16 +0000 (0:00:00.214) 0:02:07.043 ****** 2026-01-22 16:16:16,939 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:16 +0000 (0:00:00.214) 0:02:07.042 ****** 2026-01-22 16:16:16,956 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:16:16,966 p=31411 u=zuul n=ansible | TASK [run_hook : Get parameters files paths={{ (cifmw_basedir, 'artifacts/parameters') | path_join }}, file_type=file, patterns=*.yml] *** 2026-01-22 16:16:16,966 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:16 +0000 (0:00:00.027) 0:02:07.070 ****** 2026-01-22 16:16:16,967 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:16 +0000 (0:00:00.027) 0:02:07.069 ****** 2026-01-22 16:16:17,151 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:16:17,161 p=31411 u=zuul n=ansible | TASK [run_hook : Add parameters artifacts as extra variables extra_vars={{ extra_vars }} {% for file in cifmw_run_hook_parameters_files.files %} -e "@{{ file.path }}" {%- endfor %}] *** 2026-01-22 16:16:17,161 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:17 +0000 (0:00:00.194) 0:02:07.265 ****** 2026-01-22 16:16:17,161 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:17 +0000 (0:00:00.194) 0:02:07.264 ****** 2026-01-22 16:16:17,180 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:16:17,189 p=31411 u=zuul n=ansible | TASK [run_hook : Ensure log directory exists path={{ log_path | dirname }}, state=directory, mode=0755] *** 2026-01-22 16:16:17,190 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:17 +0000 (0:00:00.028) 0:02:07.294 ****** 2026-01-22 16:16:17,190 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:17 +0000 (0:00:00.028) 0:02:07.292 ****** 2026-01-22 16:16:17,373 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:16:17,382 p=31411 u=zuul n=ansible | TASK [run_hook : Ensure artifacts directory exists path={{ cifmw_basedir }}/artifacts, 
state=directory, mode=0755] *** 2026-01-22 16:16:17,382 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:17 +0000 (0:00:00.192) 0:02:07.486 ****** 2026-01-22 16:16:17,382 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:17 +0000 (0:00:00.192) 0:02:07.484 ****** 2026-01-22 16:16:17,566 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:16:17,582 p=31411 u=zuul n=ansible | TASK [run_hook : Run hook without retry - Fetch nodes facts and save them as parameters] *** 2026-01-22 16:16:17,582 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:17 +0000 (0:00:00.199) 0:02:07.686 ****** 2026-01-22 16:16:17,582 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:17 +0000 (0:00:00.199) 0:02:07.684 ****** 2026-01-22 16:16:17,640 p=31411 u=zuul n=ansible | Follow script's output here: /home/zuul/ci-framework-data/logs/ci_script_002_run_hook_without_retry_fetch.log 2026-01-22 16:16:27,499 p=31411 u=zuul n=ansible | changed: [localhost] 2026-01-22 16:16:27,508 p=31411 u=zuul n=ansible | TASK [run_hook : Run hook with retry - Fetch nodes facts and save them as parameters] *** 2026-01-22 16:16:27,508 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:27 +0000 (0:00:09.925) 0:02:17.612 ****** 2026-01-22 16:16:27,508 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:27 +0000 (0:00:09.925) 0:02:17.610 ****** 2026-01-22 16:16:27,527 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:16:27,535 p=31411 u=zuul n=ansible | TASK [run_hook : Check if we have a file path={{ cifmw_basedir }}/artifacts/{{ step }}_{{ hook_name }}.yml] *** 2026-01-22 16:16:27,535 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:27 +0000 (0:00:00.027) 0:02:17.639 ****** 2026-01-22 16:16:27,535 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:27 +0000 (0:00:00.027) 0:02:17.638 ****** 2026-01-22 16:16:27,746 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:16:27,767 p=31411 u=zuul n=ansible | TASK [run_hook : Load generated 
content in main playbook file={{ cifmw_basedir }}/artifacts/{{ step }}_{{ hook_name }}.yml] *** 2026-01-22 16:16:27,767 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:27 +0000 (0:00:00.232) 0:02:17.871 ****** 2026-01-22 16:16:27,767 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:27 +0000 (0:00:00.232) 0:02:17.870 ****** 2026-01-22 16:16:27,794 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:16:27,816 p=31411 u=zuul n=ansible | TASK [run_hook : Assert parameters are valid quiet=True, that=['_list_hooks is not string', '_list_hooks is not mapping', '_list_hooks is iterable', '(hooks | default([])) is not string', '(hooks | default([])) is not mapping', '(hooks | default([])) is iterable']] *** 2026-01-22 16:16:27,816 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:27 +0000 (0:00:00.049) 0:02:17.920 ****** 2026-01-22 16:16:27,816 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:27 +0000 (0:00:00.048) 0:02:17.919 ****** 2026-01-22 16:16:27,880 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:16:27,888 p=31411 u=zuul n=ansible | TASK [run_hook : Assert single hooks are all mappings quiet=True, that=['_not_mapping_hooks | length == 0'], msg=All single hooks must be a list of mappings or a mapping.] 
*** 2026-01-22 16:16:27,888 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:27 +0000 (0:00:00.071) 0:02:17.992 ****** 2026-01-22 16:16:27,888 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:27 +0000 (0:00:00.071) 0:02:17.991 ****** 2026-01-22 16:16:28,009 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:16:28,018 p=31411 u=zuul n=ansible | TASK [run_hook : Loop on hooks for pre_package_build _raw_params={{ hook.type }}.yml] *** 2026-01-22 16:16:28,019 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:28 +0000 (0:00:00.130) 0:02:18.123 ****** 2026-01-22 16:16:28,019 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:28 +0000 (0:00:00.130) 0:02:18.121 ****** 2026-01-22 16:16:28,138 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:16:28,151 p=31411 u=zuul n=ansible | TASK [cifmw_setup : Load parameters files dir={{ cifmw_basedir }}/artifacts/parameters] *** 2026-01-22 16:16:28,152 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:28 +0000 (0:00:00.132) 0:02:18.255 ****** 2026-01-22 16:16:28,152 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:28 +0000 (0:00:00.132) 0:02:18.254 ****** 2026-01-22 16:16:28,242 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:16:28,251 p=31411 u=zuul n=ansible | TASK [pkg_build : Generate volume list build_volumes={% for pkg in cifmw_pkg_build_list -%} - "{{ pkg.src|default(cifmw_pkg_build_pkg_basedir ~ '/' ~ pkg.name) }}:/root/src/{{ pkg.name }}:z" - "{{ cifmw_pkg_build_basedir }}/volumes/packages/{{ pkg.name }}:/root/{{ pkg.name }}:z" - "{{ cifmw_pkg_build_basedir }}/logs/build_{{ pkg.name }}:/root/logs:z" {% endfor -%} - "{{ cifmw_pkg_build_basedir }}/volumes/packages/gating_repo:/root/gating_repo:z" - "{{ cifmw_pkg_build_basedir }}/artifacts/repositories:/root/yum.repos.d:z,ro" - "{{ cifmw_pkg_build_basedir }}/artifacts/build-packages.yml:/root/playbook.yml:z,ro" ] *** 2026-01-22 16:16:28,252 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:28 
+0000 (0:00:00.100) 0:02:18.356 ****** 2026-01-22 16:16:28,252 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:28 +0000 (0:00:00.100) 0:02:18.354 ****** 2026-01-22 16:16:28,317 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:16:28,324 p=31411 u=zuul n=ansible | TASK [pkg_build : Build package using container name={{ pkg.name }}-builder, auto_remove=True, detach=False, privileged=True, log_driver=k8s-file, log_level=info, log_opt={'path': '{{ cifmw_pkg_build_basedir }}/logs/{{ pkg.name }}-builder.log'}, image={{ cifmw_pkg_build_ctx_name }}, volume={{ build_volumes | from_yaml }}, security_opt=['label=disable', 'seccomp=unconfined', 'apparmor=unconfined'], env={'PROJECT': '{{ pkg.name }}'}, command=ansible-playbook -i localhost, -c local playbook.yml] *** 2026-01-22 16:16:28,324 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:28 +0000 (0:00:00.072) 0:02:18.428 ****** 2026-01-22 16:16:28,324 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:28 +0000 (0:00:00.072) 0:02:18.426 ****** 2026-01-22 16:16:28,336 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:16:28,350 p=31411 u=zuul n=ansible | TASK [run_hook : Assert parameters are valid quiet=True, that=['_list_hooks is not string', '_list_hooks is not mapping', '_list_hooks is iterable', '(hooks | default([])) is not string', '(hooks | default([])) is not mapping', '(hooks | default([])) is iterable']] *** 2026-01-22 16:16:28,350 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:28 +0000 (0:00:00.025) 0:02:18.454 ****** 2026-01-22 16:16:28,350 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:28 +0000 (0:00:00.025) 0:02:18.452 ****** 2026-01-22 16:16:28,400 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:16:28,407 p=31411 u=zuul n=ansible | TASK [run_hook : Assert single hooks are all mappings quiet=True, that=['_not_mapping_hooks | length == 0'], msg=All single hooks must be a list of mappings or a mapping.] 
*** 2026-01-22 16:16:28,407 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:28 +0000 (0:00:00.057) 0:02:18.511 ****** 2026-01-22 16:16:28,407 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:28 +0000 (0:00:00.057) 0:02:18.510 ****** 2026-01-22 16:16:28,481 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:16:28,494 p=31411 u=zuul n=ansible | TASK [run_hook : Loop on hooks for post_package_build _raw_params={{ hook.type }}.yml] *** 2026-01-22 16:16:28,494 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:28 +0000 (0:00:00.086) 0:02:18.598 ****** 2026-01-22 16:16:28,494 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:28 +0000 (0:00:00.087) 0:02:18.597 ****** 2026-01-22 16:16:28,580 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:16:28,608 p=31411 u=zuul n=ansible | TASK [run_hook : Assert parameters are valid quiet=True, that=['_list_hooks is not string', '_list_hooks is not mapping', '_list_hooks is iterable', '(hooks | default([])) is not string', '(hooks | default([])) is not mapping', '(hooks | default([])) is iterable']] *** 2026-01-22 16:16:28,608 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:28 +0000 (0:00:00.113) 0:02:18.712 ****** 2026-01-22 16:16:28,608 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:28 +0000 (0:00:00.113) 0:02:18.711 ****** 2026-01-22 16:16:28,661 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:16:28,672 p=31411 u=zuul n=ansible | TASK [run_hook : Assert single hooks are all mappings quiet=True, that=['_not_mapping_hooks | length == 0'], msg=All single hooks must be a list of mappings or a mapping.] 
*** 2026-01-22 16:16:28,672 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:28 +0000 (0:00:00.064) 0:02:18.776 ****** 2026-01-22 16:16:28,673 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:28 +0000 (0:00:00.064) 0:02:18.775 ****** 2026-01-22 16:16:28,750 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:16:28,762 p=31411 u=zuul n=ansible | TASK [run_hook : Loop on hooks for pre_container_build _raw_params={{ hook.type }}.yml] *** 2026-01-22 16:16:28,762 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:28 +0000 (0:00:00.089) 0:02:18.866 ****** 2026-01-22 16:16:28,762 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:28 +0000 (0:00:00.089) 0:02:18.865 ****** 2026-01-22 16:16:28,839 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:16:28,857 p=31411 u=zuul n=ansible | TASK [cifmw_setup : Load parameters files dir={{ cifmw_basedir }}/artifacts/parameters] *** 2026-01-22 16:16:28,857 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:28 +0000 (0:00:00.095) 0:02:18.961 ****** 2026-01-22 16:16:28,857 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:28 +0000 (0:00:00.095) 0:02:18.960 ****** 2026-01-22 16:16:28,906 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:16:28,918 p=31411 u=zuul n=ansible | TASK [cifmw_setup : Nothing to do yet msg=No support for that step yet] ******** 2026-01-22 16:16:28,918 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:28 +0000 (0:00:00.060) 0:02:19.022 ****** 2026-01-22 16:16:28,918 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:28 +0000 (0:00:00.060) 0:02:19.021 ****** 2026-01-22 16:16:28,935 p=31411 u=zuul n=ansible | ok: [localhost] => msg: No support for that step yet 2026-01-22 16:16:28,946 p=31411 u=zuul n=ansible | TASK [run_hook : Assert parameters are valid quiet=True, that=['_list_hooks is not string', '_list_hooks is not mapping', '_list_hooks is iterable', '(hooks | default([])) is not string', '(hooks | default([])) is not 
mapping', '(hooks | default([])) is iterable']] *** 2026-01-22 16:16:28,947 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:28 +0000 (0:00:00.028) 0:02:19.051 ****** 2026-01-22 16:16:28,947 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:28 +0000 (0:00:00.028) 0:02:19.049 ****** 2026-01-22 16:16:29,008 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:16:29,019 p=31411 u=zuul n=ansible | TASK [run_hook : Assert single hooks are all mappings quiet=True, that=['_not_mapping_hooks | length == 0'], msg=All single hooks must be a list of mappings or a mapping.] *** 2026-01-22 16:16:29,019 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:29 +0000 (0:00:00.072) 0:02:19.123 ****** 2026-01-22 16:16:29,019 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:29 +0000 (0:00:00.072) 0:02:19.122 ****** 2026-01-22 16:16:29,093 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:16:29,106 p=31411 u=zuul n=ansible | TASK [run_hook : Loop on hooks for post_container_build _raw_params={{ hook.type }}.yml] *** 2026-01-22 16:16:29,107 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:29 +0000 (0:00:00.087) 0:02:19.211 ****** 2026-01-22 16:16:29,107 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:29 +0000 (0:00:00.087) 0:02:19.209 ****** 2026-01-22 16:16:29,181 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:16:29,211 p=31411 u=zuul n=ansible | TASK [run_hook : Assert parameters are valid quiet=True, that=['_list_hooks is not string', '_list_hooks is not mapping', '_list_hooks is iterable', '(hooks | default([])) is not string', '(hooks | default([])) is not mapping', '(hooks | default([])) is iterable']] *** 2026-01-22 16:16:29,212 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:29 +0000 (0:00:00.104) 0:02:19.316 ****** 2026-01-22 16:16:29,212 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:29 +0000 (0:00:00.104) 0:02:19.314 ****** 2026-01-22 16:16:29,274 p=31411 u=zuul n=ansible | ok: 
[localhost]
2026-01-22 16:16:29,283 p=31411 u=zuul n=ansible | TASK [run_hook : Assert single hooks are all mappings quiet=True, that=['_not_mapping_hooks | length == 0'], msg=All single hooks must be a list of mappings or a mapping.] ***
2026-01-22 16:16:29,283 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:29 +0000 (0:00:00.070) 0:02:19.387 ******
2026-01-22 16:16:29,283 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:29 +0000 (0:00:00.070) 0:02:19.385 ******
2026-01-22 16:16:29,361 p=31411 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:16:29,369 p=31411 u=zuul n=ansible | TASK [run_hook : Loop on hooks for pre_operator_build _raw_params={{ hook.type }}.yml] ***
2026-01-22 16:16:29,369 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:29 +0000 (0:00:00.086) 0:02:19.473 ******
2026-01-22 16:16:29,369 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:29 +0000 (0:00:00.086) 0:02:19.472 ******
2026-01-22 16:16:29,444 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:16:29,459 p=31411 u=zuul n=ansible | TASK [cifmw_setup : Load parameters files dir={{ cifmw_basedir }}/artifacts/parameters] ***
2026-01-22 16:16:29,459 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:29 +0000 (0:00:00.090) 0:02:19.563 ******
2026-01-22 16:16:29,460 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:29 +0000 (0:00:00.090) 0:02:19.562 ******
2026-01-22 16:16:29,502 p=31411 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:16:29,511 p=31411 u=zuul n=ansible | TASK [operator_build : Ensure mandatory directories exist path={{ cifmw_operator_build_basedir }}/{{ item }}, state=directory, mode=0755] ***
2026-01-22 16:16:29,511 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:29 +0000 (0:00:00.051) 0:02:19.615 ******
2026-01-22 16:16:29,511 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:29 +0000 (0:00:00.051) 0:02:19.614 ******
2026-01-22 16:16:29,541 p=31411 u=zuul n=ansible | skipping: [localhost] => (item=artifacts)
2026-01-22 16:16:29,547 p=31411 u=zuul n=ansible | skipping: [localhost] => (item=logs)
2026-01-22 16:16:29,547 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:16:29,557 p=31411 u=zuul n=ansible | TASK [operator_build : Initialize role output cifmw_operator_build_output={{ cifmw_operator_build_output }}, cifmw_operator_build_meta_name={{ cifmw_operator_build_meta_name }}] ***
2026-01-22 16:16:29,557 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:29 +0000 (0:00:00.045) 0:02:19.661 ******
2026-01-22 16:16:29,557 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:29 +0000 (0:00:00.045) 0:02:19.659 ******
2026-01-22 16:16:29,581 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:16:29,590 p=31411 u=zuul n=ansible | TASK [operator_build : Populate operators list with zuul info _raw_params=zuul_info.yml] ***
2026-01-22 16:16:29,590 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:29 +0000 (0:00:00.033) 0:02:19.694 ******
2026-01-22 16:16:29,590 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:29 +0000 (0:00:00.033) 0:02:19.693 ******
2026-01-22 16:16:29,614 p=31411 u=zuul n=ansible | skipping: [localhost] => (item={'branch': 'main', 'change': '577', 'change_url': 'https://github.com/openstack-k8s-operators/neutron-operator/pull/577', 'commit_id': 'ee35d45e2b7d36f4be2f39423542f378a292b175', 'patchset': 'ee35d45e2b7d36f4be2f39423542f378a292b175', 'project': {'canonical_hostname': 'github.com', 'canonical_name': 'github.com/openstack-k8s-operators/neutron-operator', 'name': 'openstack-k8s-operators/neutron-operator', 'short_name': 'neutron-operator', 'src_dir': 'src/github.com/openstack-k8s-operators/neutron-operator'}, 'topic': None})
2026-01-22 16:16:29,616 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:16:29,624 p=31411 u=zuul n=ansible | TASK [operator_build : Merge lists of operators operators_list={{ [cifmw_operator_build_operators, zuul_info_operators | default([])] | community.general.lists_mergeby('name') }}] ***
2026-01-22 16:16:29,624 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:29 +0000 (0:00:00.034) 0:02:19.728 ******
2026-01-22 16:16:29,624 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:29 +0000 (0:00:00.034) 0:02:19.727 ******
2026-01-22 16:16:29,646 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:16:29,655 p=31411 u=zuul n=ansible | TASK [operator_build : Get meta_operator src dir from operators_list cifmw_operator_build_meta_src={{ (operators_list | selectattr('name', 'eq', cifmw_operator_build_meta_name) | map(attribute='src') | first ) | default(cifmw_operator_build_meta_src, true) }}] ***
2026-01-22 16:16:29,655 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:29 +0000 (0:00:00.030) 0:02:19.759 ******
2026-01-22 16:16:29,655 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:29 +0000 (0:00:00.030) 0:02:19.757 ******
2026-01-22 16:16:30,690 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:16:30,699 p=31411 u=zuul n=ansible | TASK [operator_build : Adds meta-operator to the list operators_list={{ [operators_list, meta_operator_info] | community.general.lists_mergeby('name') }}] ***
2026-01-22 16:16:30,699 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:30 +0000 (0:00:01.043) 0:02:20.803 ******
2026-01-22 16:16:30,699 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:30 +0000 (0:00:01.043) 0:02:20.801 ******
2026-01-22 16:16:30,727 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:16:30,734 p=31411 u=zuul n=ansible | TASK [operator_build : Clone operator's code when src dir is empty _raw_params=clone.yml] ***
2026-01-22 16:16:30,734 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:30 +0000 (0:00:00.035) 0:02:20.838 ******
2026-01-22 16:16:30,734 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:30 +0000 (0:00:00.035) 0:02:20.837 ******
2026-01-22 16:16:30,758 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:16:30,767 p=31411 u=zuul n=ansible | TASK [operator_build : Building operators _raw_params=build.yml] ***************
2026-01-22 16:16:30,767 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:30 +0000 (0:00:00.032) 0:02:20.871 ******
2026-01-22 16:16:30,767 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:30 +0000 (0:00:00.032) 0:02:20.869 ******
2026-01-22 16:16:30,788 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:16:30,796 p=31411 u=zuul n=ansible | TASK [operator_build : Building meta operator _raw_params=build.yml] ***********
2026-01-22 16:16:30,796 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:30 +0000 (0:00:00.029) 0:02:20.900 ******
2026-01-22 16:16:30,796 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:30 +0000 (0:00:00.029) 0:02:20.899 ******
2026-01-22 16:16:30,817 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:16:30,826 p=31411 u=zuul n=ansible | TASK [operator_build : Gather role output dest={{ cifmw_operator_build_basedir }}/artifacts/custom-operators.yml, content={{ cifmw_operator_build_output | to_nice_yaml }}, mode=0644] ***
2026-01-22 16:16:30,826 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:30 +0000 (0:00:00.030) 0:02:20.930 ******
2026-01-22 16:16:30,826 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:30 +0000 (0:00:00.030) 0:02:20.929 ******
2026-01-22 16:16:30,847 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:16:30,861 p=31411 u=zuul n=ansible | TASK [run_hook : Assert parameters are valid quiet=True, that=['_list_hooks is not string', '_list_hooks is not mapping', '_list_hooks is iterable', '(hooks | default([])) is not string', '(hooks | default([])) is not mapping', '(hooks | default([])) is iterable']] ***
2026-01-22 16:16:30,861 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:30 +0000 (0:00:00.034) 0:02:20.965 ******
2026-01-22 16:16:30,861 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:30 +0000 (0:00:00.034) 0:02:20.964 ******
2026-01-22 16:16:30,911 p=31411 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:16:30,920 p=31411 u=zuul n=ansible | TASK [run_hook : Assert single hooks are all mappings quiet=True, that=['_not_mapping_hooks | length == 0'], msg=All single hooks must be a list of mappings or a mapping.] ***
2026-01-22 16:16:30,920 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:30 +0000 (0:00:00.058) 0:02:21.024 ******
2026-01-22 16:16:30,920 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:30 +0000 (0:00:00.058) 0:02:21.023 ******
2026-01-22 16:16:30,994 p=31411 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:16:31,003 p=31411 u=zuul n=ansible | TASK [run_hook : Loop on hooks for post_operator_build _raw_params={{ hook.type }}.yml] ***
2026-01-22 16:16:31,003 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:31 +0000 (0:00:00.082) 0:02:21.107 ******
2026-01-22 16:16:31,003 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:31 +0000 (0:00:00.082) 0:02:21.106 ******
2026-01-22 16:16:31,092 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:16:31,111 p=31411 u=zuul n=ansible | TASK [run_hook : Assert parameters are valid quiet=True, that=['_list_hooks is not string', '_list_hooks is not mapping', '_list_hooks is iterable', '(hooks | default([])) is not string', '(hooks | default([])) is not mapping', '(hooks | default([])) is iterable']] ***
2026-01-22 16:16:31,111 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:31 +0000 (0:00:00.107) 0:02:21.215 ******
2026-01-22 16:16:31,111 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:31 +0000 (0:00:00.107) 0:02:21.213 ******
2026-01-22 16:16:31,166 p=31411 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:16:31,173 p=31411 u=zuul n=ansible | TASK [run_hook : Assert single hooks are all mappings quiet=True, that=['_not_mapping_hooks | length == 0'], msg=All single hooks must be a list of mappings or a mapping.] ***
2026-01-22 16:16:31,174 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:31 +0000 (0:00:00.062) 0:02:21.277 ******
2026-01-22 16:16:31,174 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:31 +0000 (0:00:00.062) 0:02:21.276 ******
2026-01-22 16:16:31,249 p=31411 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:16:31,258 p=31411 u=zuul n=ansible | TASK [run_hook : Loop on hooks for pre_deploy _raw_params={{ hook.type }}.yml] ***
2026-01-22 16:16:31,258 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:31 +0000 (0:00:00.084) 0:02:21.362 ******
2026-01-22 16:16:31,258 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:31 +0000 (0:00:00.084) 0:02:21.361 ******
2026-01-22 16:16:31,359 p=31411 u=zuul n=ansible | included: /home/zuul/src/github.com/openstack-k8s-operators/ci-framework/roles/run_hook/tasks/playbook.yml for localhost => (item={'name': '80 Kustomize OpenStack CR', 'type': 'playbook', 'source': 'control_plane_horizon.yml'})
2026-01-22 16:16:31,370 p=31411 u=zuul n=ansible | TASK [run_hook : Set playbook path for 80 Kustomize OpenStack CR cifmw_basedir={{ _bdir }}, hook_name={{ _hook_name }}, playbook_path={{ _play | realpath }}, log_path={{ _bdir }}/logs/{{ step }}_{{ _hook_name }}.log, extra_vars=-e namespace={{ cifmw_openstack_namespace }} {%- if hook.extra_vars is defined and hook.extra_vars|length > 0 -%} {% for key,value in hook.extra_vars.items() -%} {%- if key == 'file' %} -e "@{{ value }}" {%- else %} -e "{{ key }}={{ value }}" {%- endif %} {%- endfor %} {%- endif %}] ***
2026-01-22 16:16:31,371 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:31 +0000 (0:00:00.112) 0:02:21.475 ******
2026-01-22 16:16:31,371 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:31 +0000 (0:00:00.112) 0:02:21.473 ******
2026-01-22 16:16:31,409 p=31411 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:16:31,418 p=31411 u=zuul n=ansible | TASK [run_hook : Get file stat path={{ playbook_path }}] ***********************
2026-01-22 16:16:31,418 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:31 +0000 (0:00:00.047) 0:02:21.522 ******
2026-01-22 16:16:31,418 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:31 +0000 (0:00:00.047) 0:02:21.521 ******
2026-01-22 16:16:31,613 p=31411 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:16:31,621 p=31411 u=zuul n=ansible | TASK [run_hook : Fail if playbook doesn't exist msg=Playbook {{ playbook_path }} doesn't seem to exist.] ***
2026-01-22 16:16:31,622 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:31 +0000 (0:00:00.203) 0:02:21.725 ******
2026-01-22 16:16:31,622 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:31 +0000 (0:00:00.203) 0:02:21.724 ******
2026-01-22 16:16:31,640 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:16:31,647 p=31411 u=zuul n=ansible | TASK [run_hook : Get parameters files paths={{ (cifmw_basedir, 'artifacts/parameters') | path_join }}, file_type=file, patterns=*.yml] ***
2026-01-22 16:16:31,648 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:31 +0000 (0:00:00.026) 0:02:21.752 ******
2026-01-22 16:16:31,648 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:31 +0000 (0:00:00.026) 0:02:21.750 ******
2026-01-22 16:16:31,823 p=31411 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:16:31,832 p=31411 u=zuul n=ansible | TASK [run_hook : Add parameters artifacts as extra variables extra_vars={{ extra_vars }} {% for file in cifmw_run_hook_parameters_files.files %} -e "@{{ file.path }}" {%- endfor %}] ***
2026-01-22 16:16:31,832 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:31 +0000 (0:00:00.184) 0:02:21.936 ******
2026-01-22 16:16:31,832 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:31 +0000 (0:00:00.184) 0:02:21.935 ******
2026-01-22 16:16:31,855 p=31411 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:16:31,864 p=31411 u=zuul n=ansible | TASK [run_hook : Ensure log directory exists path={{ log_path | dirname }}, state=directory, mode=0755] ***
2026-01-22 16:16:31,864 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:31 +0000 (0:00:00.032) 0:02:21.968 ******
2026-01-22 16:16:31,864 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:31 +0000 (0:00:00.032) 0:02:21.967 ******
2026-01-22 16:16:32,045 p=31411 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:16:32,053 p=31411 u=zuul n=ansible | TASK [run_hook : Ensure artifacts directory exists path={{ cifmw_basedir }}/artifacts, state=directory, mode=0755] ***
2026-01-22 16:16:32,053 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:32 +0000 (0:00:00.189) 0:02:22.157 ******
2026-01-22 16:16:32,053 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:32 +0000 (0:00:00.189) 0:02:22.156 ******
2026-01-22 16:16:32,272 p=31411 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:16:32,289 p=31411 u=zuul n=ansible | TASK [run_hook : Run hook without retry - 80 Kustomize OpenStack CR] ***********
2026-01-22 16:16:32,289 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:32 +0000 (0:00:00.235) 0:02:22.393 ******
2026-01-22 16:16:32,290 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:32 +0000 (0:00:00.235) 0:02:22.392 ******
2026-01-22 16:16:32,341 p=31411 u=zuul n=ansible | Follow script's output here: /home/zuul/ci-framework-data/logs/ci_script_003_run_hook_without_retry_80.log
2026-01-22 16:16:33,983 p=31411 u=zuul n=ansible | changed: [localhost]
2026-01-22 16:16:33,994 p=31411 u=zuul n=ansible | TASK [run_hook : Run hook with retry - 80 Kustomize OpenStack CR] **************
2026-01-22 16:16:33,994 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:33 +0000 (0:00:01.705) 0:02:24.098 ******
2026-01-22 16:16:33,995 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:33 +0000 (0:00:01.704) 0:02:24.097 ******
2026-01-22 16:16:34,018 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:16:34,026 p=31411 u=zuul n=ansible | TASK [run_hook : Check if we have a file path={{ cifmw_basedir }}/artifacts/{{ step }}_{{ hook_name }}.yml] ***
2026-01-22 16:16:34,027 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:34 +0000 (0:00:00.032) 0:02:24.131 ******
2026-01-22 16:16:34,027 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:34 +0000 (0:00:00.032) 0:02:24.129 ******
2026-01-22 16:16:34,218 p=31411 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:16:34,226 p=31411 u=zuul n=ansible | TASK [run_hook : Load generated content in main playbook file={{ cifmw_basedir }}/artifacts/{{ step }}_{{ hook_name }}.yml] ***
2026-01-22 16:16:34,226 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:34 +0000 (0:00:00.199) 0:02:24.330 ******
2026-01-22 16:16:34,226 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:34 +0000 (0:00:00.199) 0:02:24.329 ******
2026-01-22 16:16:34,247 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:16:34,261 p=31411 u=zuul n=ansible | TASK [cifmw_setup : Load parameters files dir={{ cifmw_basedir }}/artifacts/parameters] ***
2026-01-22 16:16:34,261 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:34 +0000 (0:00:00.034) 0:02:24.365 ******
2026-01-22 16:16:34,261 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:34 +0000 (0:00:00.034) 0:02:24.363 ******
2026-01-22 16:16:34,306 p=31411 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:16:34,315 p=31411 u=zuul n=ansible | TASK [Configure Storage Class name=ci_local_storage] ***************************
2026-01-22 16:16:34,315 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:34 +0000 (0:00:00.054) 0:02:24.419 ******
2026-01-22 16:16:34,315 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:34 +0000 (0:00:00.054) 0:02:24.418 ******
2026-01-22 16:16:34,422 p=31411 u=zuul n=ansible | TASK [ci_local_storage : Create role needed directories path={{ cifmw_cls_manifests_dir }}, state=directory, mode=0755] ***
2026-01-22 16:16:34,422 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:34 +0000 (0:00:00.107) 0:02:24.526 ******
2026-01-22 16:16:34,422 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:34 +0000 (0:00:00.107) 0:02:24.525 ******
2026-01-22 16:16:34,609 p=31411 u=zuul n=ansible | changed: [localhost]
2026-01-22 16:16:34,616 p=31411 u=zuul n=ansible | TASK [ci_local_storage : Create the cifmw_cls_namespace namespace" kubeconfig={{ cifmw_openshift_kubeconfig }}, api_key={{ cifmw_openshift_token | default(omit)}}, context={{ cifmw_openshift_context | default(omit) }}, name={{ cifmw_cls_namespace }}, kind=Namespace, state=present] ***
2026-01-22 16:16:34,616 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:34 +0000 (0:00:00.194) 0:02:24.720 ******
2026-01-22 16:16:34,617 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:34 +0000 (0:00:00.194) 0:02:24.719 ******
2026-01-22 16:16:35,355 p=31411 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:16:35,365 p=31411 u=zuul n=ansible | TASK [ci_local_storage : Save storage manifests as artifacts dest={{ cifmw_cls_manifests_dir }}/storage-class.yaml, content={{ cifmw_cls_storage_manifest | to_nice_yaml }}, mode=0644] ***
2026-01-22 16:16:35,365 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:35 +0000 (0:00:00.748) 0:02:25.469 ******
2026-01-22 16:16:35,365 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:35 +0000 (0:00:00.748) 0:02:25.468 ******
2026-01-22 16:16:35,745 p=31411 u=zuul n=ansible | changed: [localhost]
2026-01-22 16:16:35,754 p=31411 u=zuul n=ansible | TASK [ci_local_storage : Get k8s nodes kubeconfig={{ cifmw_openshift_kubeconfig }}, api_key={{ cifmw_openshift_token | default(omit)}}, context={{ cifmw_openshift_context | default(omit)}}, kind=Node] ***
2026-01-22 16:16:35,754 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:35 +0000 (0:00:00.389) 0:02:25.858 ******
2026-01-22 16:16:35,754 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:35 +0000 (0:00:00.389) 0:02:25.857 ******
2026-01-22 16:16:36,476 p=31411 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:16:36,487 p=31411 u=zuul n=ansible | TASK [ci_local_storage : Fetch hostnames for all hosts _raw_params=hostname] ***
2026-01-22 16:16:36,487 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:36 +0000 (0:00:00.733) 0:02:26.591 ******
2026-01-22 16:16:36,487 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:36 +0000 (0:00:00.733) 0:02:26.590 ******
2026-01-22 16:16:36,746 p=31411 u=zuul n=ansible | changed: [localhost -> compute-0(38.102.83.176)] => (item=compute-0)
2026-01-22 16:16:38,779 p=31411 u=zuul n=ansible | changed: [localhost -> crc(38.102.83.145)] => (item=crc)
2026-01-22 16:16:39,129 p=31411 u=zuul n=ansible | changed: [localhost -> controller(38.102.83.217)] => (item=controller)
2026-01-22 16:16:39,312 p=31411 u=zuul n=ansible | changed: [localhost] => (item=localhost)
2026-01-22 16:16:39,313 p=31411 u=zuul n=ansible | [WARNING]: Platform linux on host localhost is using the discovered Python interpreter at /usr/bin/python3.9, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.15/reference_appendices/interpreter_discovery.html for more information.
2026-01-22 16:16:39,321 p=31411 u=zuul n=ansible | TASK [ci_local_storage : Set the hosts k8s ansible hosts cifmw_ci_local_storage_k8s_hosts={{ _host_map | selectattr("key", "in", k8s_nodes_hostnames) | map(attribute="value") | list }}, cifmw_ci_local_storage_k8s_hostnames={{ k8s_nodes_hostnames }}] ***
2026-01-22 16:16:39,322 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:39 +0000 (0:00:02.834) 0:02:29.426 ******
2026-01-22 16:16:39,322 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:39 +0000 (0:00:02.834) 0:02:29.424 ******
2026-01-22 16:16:39,356 p=31411 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:16:39,364 p=31411 u=zuul n=ansible | TASK [ci_local_storage : Apply the storage class manifests kubeconfig={{ cifmw_openshift_kubeconfig }}, api_key={{ cifmw_openshift_token | default(omit) }}, context={{ cifmw_openshift_context | default(omit) }}, state=present, src={{ cifmw_cls_manifests_dir }}/storage-class.yaml] ***
2026-01-22 16:16:39,365 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:39 +0000 (0:00:00.042) 0:02:29.468 ******
2026-01-22 16:16:39,365 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:39 +0000 (0:00:00.042) 0:02:29.467 ******
2026-01-22 16:16:40,103 p=31411 u=zuul n=ansible | changed: [localhost]
2026-01-22 16:16:40,115 p=31411 u=zuul n=ansible | TASK [ci_local_storage : Create directories on worker node _raw_params=worker_node_dirs.yml] ***
2026-01-22 16:16:40,115 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:40 +0000 (0:00:00.750) 0:02:30.219 ******
2026-01-22 16:16:40,115 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:40 +0000 (0:00:00.750) 0:02:30.218 ******
2026-01-22 16:16:40,151 p=31411 u=zuul n=ansible | included: /home/zuul/src/github.com/openstack-k8s-operators/ci-framework/roles/ci_local_storage/tasks/worker_node_dirs.yml for localhost => (item=crc)
2026-01-22 16:16:40,170 p=31411 u=zuul n=ansible | TASK [ci_local_storage : Perform action in the PV directory path={{ [ cifmw_cls_local_storage_name, 'pv'+ ("%02d" | format(item | int)) ] | path_join }}, state={{ 'directory' if cifmw_cls_action == 'create' else 'absent' }}, mode=0775] ***
2026-01-22 16:16:40,170 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:40 +0000 (0:00:00.055) 0:02:30.274 ******
2026-01-22 16:16:40,170 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:40 +0000 (0:00:00.055) 0:02:30.273 ******
2026-01-22 16:16:40,760 p=31411 u=zuul n=ansible | changed: [localhost -> crc(38.102.83.145)] => (item=1)
2026-01-22 16:16:41,233 p=31411 u=zuul n=ansible | changed: [localhost -> crc(38.102.83.145)] => (item=2)
2026-01-22 16:16:41,722 p=31411 u=zuul n=ansible | changed: [localhost -> crc(38.102.83.145)] => (item=3)
2026-01-22 16:16:42,274 p=31411 u=zuul n=ansible | changed: [localhost -> crc(38.102.83.145)] => (item=4)
2026-01-22 16:16:42,716 p=31411 u=zuul n=ansible | changed: [localhost -> crc(38.102.83.145)] => (item=5)
2026-01-22 16:16:43,224 p=31411 u=zuul n=ansible | changed: [localhost -> crc(38.102.83.145)] => (item=6)
2026-01-22 16:16:43,737 p=31411 u=zuul n=ansible | changed: [localhost -> crc(38.102.83.145)] => (item=7)
2026-01-22 16:16:44,253 p=31411 u=zuul n=ansible | changed: [localhost -> crc(38.102.83.145)] => (item=8)
2026-01-22 16:16:44,823 p=31411 u=zuul n=ansible | changed: [localhost -> crc(38.102.83.145)] => (item=9)
2026-01-22 16:16:45,347 p=31411 u=zuul n=ansible | changed: [localhost -> crc(38.102.83.145)] => (item=10)
2026-01-22 16:16:45,851 p=31411 u=zuul n=ansible | changed: [localhost -> crc(38.102.83.145)] => (item=11)
2026-01-22 16:16:46,427 p=31411 u=zuul n=ansible | changed: [localhost -> crc(38.102.83.145)] => (item=12)
2026-01-22 16:16:46,454 p=31411 u=zuul n=ansible | TASK [ci_local_storage : Generate pv related storage manifest file src=storage.yaml.j2, dest={{ cifmw_cls_manifests_dir }}/storage.yaml, mode=0644] ***
2026-01-22 16:16:46,455 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:46 +0000 (0:00:06.284) 0:02:36.559 ******
2026-01-22 16:16:46,455 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:46 +0000 (0:00:06.284) 0:02:36.557 ******
2026-01-22 16:16:46,953 p=31411 u=zuul n=ansible | changed: [localhost]
2026-01-22 16:16:46,961 p=31411 u=zuul n=ansible | TASK [ci_local_storage : Apply pv related storage manifest file kubeconfig={{ cifmw_openshift_kubeconfig }}, api_key={{ cifmw_openshift_token | default(omit) }}, context={{ cifmw_openshift_context | default(omit) }}, state=present, src={{ cifmw_cls_manifests_dir }}/storage.yaml] ***
2026-01-22 16:16:46,961 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:46 +0000 (0:00:00.506) 0:02:37.065 ******
2026-01-22 16:16:46,961 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:46 +0000 (0:00:00.506) 0:02:37.064 ******
2026-01-22 16:16:47,823 p=31411 u=zuul n=ansible | changed: [localhost]
2026-01-22 16:16:47,841 p=31411 u=zuul n=ansible | TASK [Configure LVMS Storage Class name=ci_lvms_storage] ***********************
2026-01-22 16:16:47,841 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:47 +0000 (0:00:00.879) 0:02:37.945 ******
2026-01-22 16:16:47,841 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:47 +0000 (0:00:00.879) 0:02:37.943 ******
2026-01-22 16:16:47,865 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:16:47,877 p=31411 u=zuul n=ansible | TASK [Run edpm_prepare name=edpm_prepare] **************************************
2026-01-22 16:16:47,877 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:47 +0000 (0:00:00.036) 0:02:37.981 ******
2026-01-22 16:16:47,877 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:47 +0000 (0:00:00.036) 0:02:37.980 ******
2026-01-22 16:16:48,013 p=31411 u=zuul n=ansible | TASK [edpm_prepare : Define minimal set of repo variables when not running on Zuul _install_yamls_repos={'OPENSTACK_BRANCH': '', "GIT_CLONE_OPTS'": '-l', "OPENSTACK_REPO'": '{{ operators_build_output[cifmw_operator_build_meta_name].git_src_dir }}'}] ***
2026-01-22 16:16:48,013 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:48 +0000 (0:00:00.136) 0:02:38.117 ******
2026-01-22 16:16:48,013 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:48 +0000 (0:00:00.136) 0:02:38.116 ******
2026-01-22 16:16:48,052 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:16:48,060 p=31411 u=zuul n=ansible | TASK [edpm_prepare : Set install_yamls Makefile environment variables cifmw_edpm_prepare_common_env={{ cifmw_install_yamls_environment | combine({'PATH': cifmw_path}) | combine(_install_yamls_repos | default({})) | combine(cifmw_edpm_prepare_extra_vars | default({})) }}, cifmw_edpm_prepare_make_openstack_env={% if cifmw_operator_build_meta_name is defined and cifmw_operator_build_meta_name in operators_build_output %} OPENSTACK_IMG: {{ operators_build_output[cifmw_operator_build_meta_name].image_catalog }} {% endif %} , cifmw_edpm_prepare_make_openstack_deploy_prep_env=CLEANUP_DIR_CMD: "true" , cifmw_edpm_prepare_operators_build_output={{ operators_build_output }}] ***
2026-01-22 16:16:48,060 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:48 +0000 (0:00:00.046) 0:02:38.164 ******
2026-01-22 16:16:48,060 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:48 +0000 (0:00:00.046) 0:02:38.163 ******
2026-01-22 16:16:48,092 p=31411 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:16:48,100 p=31411 u=zuul n=ansible | TASK [Prepare storage in CRC name=install_yamls_makes, tasks_from=make_crc_storage] ***
2026-01-22 16:16:48,100 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:48 +0000 (0:00:00.039) 0:02:38.204 ******
2026-01-22 16:16:48,100 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:48 +0000 (0:00:00.039) 0:02:38.203 ******
2026-01-22 16:16:48,127 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:16:48,136 p=31411 u=zuul n=ansible | TASK [Prepare inputs name=install_yamls_makes, tasks_from=make_input] **********
2026-01-22 16:16:48,137 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:48 +0000 (0:00:00.036) 0:02:38.241 ******
2026-01-22 16:16:48,137 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:48 +0000 (0:00:00.036) 0:02:38.239 ******
2026-01-22 16:16:48,188 p=31411 u=zuul n=ansible | TASK [install_yamls_makes : Debug make_input_env var=make_input_env] ***********
2026-01-22 16:16:48,189 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:48 +0000 (0:00:00.052) 0:02:38.293 ******
2026-01-22 16:16:48,189 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:48 +0000 (0:00:00.052) 0:02:38.291 ******
2026-01-22 16:16:48,215 p=31411 u=zuul n=ansible | ok: [localhost] =>
  make_input_env:
    BMO_SETUP: false
    CHECKOUT_FROM_OPENSTACK_REF: 'true'
    INSTALL_CERT_MANAGER: false
    KUBECONFIG: /home/zuul/.crc/machines/crc/kubeconfig
    NETWORK_MTU: 1500
    NEUTRON_BRANCH: ''
    NEUTRON_REPO: /home/zuul/src/github.com/openstack-k8s-operators/neutron-operator
    NNCP_DNS_SERVER: 192.168.122.10
    NNCP_INTERFACE: ens7
    OPENSTACK_K8S_BRANCH: main
    OUT: /home/zuul/ci-framework-data/artifacts/manifests
    OUTPUT_DIR: /home/zuul/ci-framework-data/artifacts/edpm
    PATH: /home/zuul/.crc/bin:/home/zuul/.crc/bin/oc:/home/zuul/bin:/home/zuul/.local/bin:/home/zuul/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin
2026-01-22 16:16:48,224 p=31411 u=zuul n=ansible | TASK [install_yamls_makes : Debug make_input_params var=make_input_params] *****
2026-01-22 16:16:48,224 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:48 +0000 (0:00:00.035) 0:02:38.328 ******
2026-01-22 16:16:48,224 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:48 +0000 (0:00:00.035) 0:02:38.327 ******
2026-01-22 16:16:48,245 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:16:48,254 p=31411 u=zuul n=ansible | TASK [install_yamls_makes : Run input output_dir={{ cifmw_basedir }}/artifacts, chdir=/home/zuul/src/github.com/openstack-k8s-operators/install_yamls, script=make input, dry_run={{ make_input_dryrun|default(false)|bool }}, extra_args={{ dict((make_input_env|default({})), **(make_input_params|default({}))) }}] ***
2026-01-22 16:16:48,254 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:48 +0000 (0:00:00.029) 0:02:38.358 ******
2026-01-22 16:16:48,254 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:48 +0000 (0:00:00.029) 0:02:38.356 ******
2026-01-22 16:16:48,296 p=31411 u=zuul n=ansible | Follow script's output here: /home/zuul/ci-framework-data/logs/ci_script_004_run.log
2026-01-22 16:16:49,572 p=31411 u=zuul n=ansible | [WARNING]: conditional statements should not include jinja2 templating delimiters such as {{ }} or {% %}. Found: {{ make_input_until | default(true) }}
2026-01-22 16:16:49,573 p=31411 u=zuul n=ansible | changed: [localhost]
2026-01-22 16:16:49,586 p=31411 u=zuul n=ansible | TASK [OpenStack meta-operator installation name=install_yamls_makes, tasks_from=make_openstack] ***
2026-01-22 16:16:49,586 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:49 +0000 (0:00:01.332) 0:02:39.690 ******
2026-01-22 16:16:49,586 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:49 +0000 (0:00:01.332) 0:02:39.689 ******
2026-01-22 16:16:49,632 p=31411 u=zuul n=ansible | TASK [install_yamls_makes : Debug make_openstack_env var=make_openstack_env] ***
2026-01-22 16:16:49,633 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:49 +0000 (0:00:00.046) 0:02:39.736 ******
2026-01-22 16:16:49,633 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:49 +0000 (0:00:00.046) 0:02:39.735 ******
2026-01-22 16:16:49,661 p=31411 u=zuul n=ansible | ok: [localhost] =>
  make_openstack_env:
    BMO_SETUP: false
    CHECKOUT_FROM_OPENSTACK_REF: 'true'
    INSTALL_CERT_MANAGER: false
    KUBECONFIG: /home/zuul/.crc/machines/crc/kubeconfig
    NETWORK_MTU: 1500
    NEUTRON_BRANCH: ''
    NEUTRON_REPO: /home/zuul/src/github.com/openstack-k8s-operators/neutron-operator
    NNCP_DNS_SERVER: 192.168.122.10
    NNCP_INTERFACE: ens7
    OPENSTACK_IMG: 38.102.83.113:5001/openstack-k8s-operators/openstack-operator-index:bc2323a91af143d35003723e61529632dd3a067a
    OPENSTACK_K8S_BRANCH: main
    OUT: /home/zuul/ci-framework-data/artifacts/manifests
    OUTPUT_DIR: /home/zuul/ci-framework-data/artifacts/edpm
    PATH: /home/zuul/.crc/bin:/home/zuul/.crc/bin/oc:/home/zuul/bin:/home/zuul/.local/bin:/home/zuul/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin
2026-01-22 16:16:49,671 p=31411 u=zuul n=ansible | TASK [install_yamls_makes : Debug make_openstack_params var=make_openstack_params] ***
2026-01-22 16:16:49,672 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:49 +0000 (0:00:00.038) 0:02:39.775 ******
2026-01-22 16:16:49,672 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:49 +0000 (0:00:00.039) 0:02:39.774 ******
2026-01-22 16:16:49,697 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:16:49,705 p=31411 u=zuul n=ansible | TASK [install_yamls_makes : Run openstack output_dir={{ cifmw_basedir }}/artifacts, chdir=/home/zuul/src/github.com/openstack-k8s-operators/install_yamls, script=make openstack, dry_run={{ make_openstack_dryrun|default(false)|bool }}, extra_args={{ dict((make_openstack_env|default({})), **(make_openstack_params|default({}))) }}] ***
2026-01-22 16:16:49,705 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:49 +0000 (0:00:00.033) 0:02:39.809 ******
2026-01-22 16:16:49,705 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:16:49 +0000 (0:00:00.033) 0:02:39.808 ******
2026-01-22 16:16:49,756 p=31411 u=zuul n=ansible | Follow script's output here: /home/zuul/ci-framework-data/logs/ci_script_005_run.log
2026-01-22 16:19:05,443 p=31411 u=zuul n=ansible | [WARNING]: conditional statements should not include jinja2 templating delimiters such as {{ }} or {% %}. Found: {{ make_openstack_until | default(true) }}
2026-01-22 16:19:05,451 p=31411 u=zuul n=ansible | changed: [localhost]
2026-01-22 16:19:05,472 p=31411 u=zuul n=ansible | TASK [edpm_prepare : Wait for OpenStack subscription creation _raw_params=oc get sub openstack-operator --namespace={{ cifmw_install_yamls_defaults['OPERATOR_NAMESPACE'] }} -o=jsonpath='{.status.installplan.name}'] ***
2026-01-22 16:19:05,472 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:19:05 +0000 (0:02:15.766) 0:04:55.576 ******
2026-01-22 16:19:05,472 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:19:05 +0000 (0:02:15.766) 0:04:55.575 ******
2026-01-22 16:20:06,637 p=31411 u=zuul n=ansible | changed: [localhost]
2026-01-22 16:20:06,644 p=31411 u=zuul n=ansible | TASK [edpm_prepare : Wait for OpenStack operator to get installed _raw_params=oc wait InstallPlan {{ cifmw_edpm_prepare_wait_installplan_out.stdout }} --namespace={{ cifmw_install_yamls_defaults['OPERATOR_NAMESPACE'] }} --for=jsonpath='{.status.phase}'=Complete --timeout=20m] ***
2026-01-22 16:20:06,645 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:20:06 +0000 (0:01:01.172) 0:05:56.749 ******
2026-01-22 16:20:06,645 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:20:06 +0000 (0:01:01.172) 0:05:56.747 ******
2026-01-22 16:20:07,088 p=31411 u=zuul n=ansible | changed: [localhost]
2026-01-22 16:20:07,096 p=31411 u=zuul n=ansible | TASK [edpm_prepare : Check if the OpenStack initialization CRD exists kubeconfig={{ cifmw_openshift_kubeconfig }}, api_key={{ cifmw_openshift_token | default(omit) }}, context={{ cifmw_openshift_context | default(omit) }}, kind=CustomResourceDefinition, name=openstacks.operator.openstack.org] ***
2026-01-22 16:20:07,096 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:20:07 +0000 (0:00:00.451) 0:05:57.200 ******
2026-01-22 16:20:07,096 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:20:07 +0000 (0:00:00.451) 0:05:57.199 ******
2026-01-22 16:20:07,988 p=31411 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:20:07,997 p=31411 u=zuul n=ansible | TASK [OpenStack meta-operator initialization, if necessary name=install_yamls_makes, tasks_from=make_openstack_init] ***
2026-01-22 16:20:07,997 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:20:07 +0000 (0:00:00.900) 0:05:58.101 ******
2026-01-22 16:20:07,997 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:20:07 +0000 (0:00:00.900) 0:05:58.100 ******
2026-01-22 16:20:08,051 p=31411 u=zuul n=ansible | TASK [install_yamls_makes : Debug make_openstack_init_env var=make_openstack_init_env] ***
2026-01-22 16:20:08,051 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:20:08 +0000 (0:00:00.054) 0:05:58.155 ******
2026-01-22 16:20:08,051 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:20:08 +0000 (0:00:00.054) 0:05:58.154 ******
2026-01-22 16:20:08,080 p=31411 u=zuul n=ansible | ok: [localhost] =>
  make_openstack_init_env:
    BMO_SETUP: false
    CHECKOUT_FROM_OPENSTACK_REF: 'true'
    INSTALL_CERT_MANAGER: false
    KUBECONFIG: /home/zuul/.crc/machines/crc/kubeconfig
    NETWORK_MTU: 1500
    NEUTRON_BRANCH: ''
    NEUTRON_REPO: /home/zuul/src/github.com/openstack-k8s-operators/neutron-operator
    NNCP_DNS_SERVER: 192.168.122.10
    NNCP_INTERFACE: ens7
    OPENSTACK_K8S_BRANCH: main
    OUT: /home/zuul/ci-framework-data/artifacts/manifests
    OUTPUT_DIR: /home/zuul/ci-framework-data/artifacts/edpm
    PATH: /home/zuul/.crc/bin:/home/zuul/.crc/bin/oc:/home/zuul/bin:/home/zuul/.local/bin:/home/zuul/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin
2026-01-22 16:20:08,087 p=31411 u=zuul n=ansible | TASK [install_yamls_makes : Debug make_openstack_init_params var=make_openstack_init_params] ***
2026-01-22 16:20:08,088 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:20:08 +0000 (0:00:00.036) 0:05:58.191 ******
2026-01-22 16:20:08,088 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:20:08 +0000 (0:00:00.036) 0:05:58.190 ******
2026-01-22 16:20:08,110 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:20:08,120 p=31411 u=zuul n=ansible | TASK [install_yamls_makes : Run openstack_init output_dir={{ cifmw_basedir }}/artifacts, chdir=/home/zuul/src/github.com/openstack-k8s-operators/install_yamls, script=make openstack_init, dry_run={{ make_openstack_init_dryrun|default(false)|bool }}, extra_args={{ dict((make_openstack_init_env|default({})), **(make_openstack_init_params|default({}))) }}] ***
2026-01-22 16:20:08,120 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:20:08 +0000 (0:00:00.032) 0:05:58.224 ******
2026-01-22 16:20:08,120 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:20:08 +0000 (0:00:00.032) 0:05:58.223 ******
2026-01-22 16:20:08,178 p=31411 u=zuul n=ansible | Follow script's output here: /home/zuul/ci-framework-data/logs/ci_script_006_run_openstack.log
2026-01-22 16:21:42,453 p=31411 u=zuul n=ansible | [WARNING]: conditional statements should not include jinja2 templating delimiters such as {{ }} or {% %}.
Found: {{ make_openstack_init_until | default(true) }}
2026-01-22 16:21:42,455 p=31411 u=zuul n=ansible | changed: [localhost]
2026-01-22 16:21:42,471 p=31411 u=zuul n=ansible | TASK [Update OpenStack Services containers Env name=set_openstack_containers] ***
2026-01-22 16:21:42,471 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:21:42 +0000 (0:01:34.351) 0:07:32.575 ******
2026-01-22 16:21:42,471 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:21:42 +0000 (0:01:34.351) 0:07:32.574 ******
2026-01-22 16:21:42,500 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:21:42,509 p=31411 u=zuul n=ansible | TASK [edpm_prepare : Set facts for baremetal UEFI image url cifmw_update_containers_edpm_image_url={{ cifmw_build_images_output['images']['edpm-hardened-uefi']['image'] }}, cacheable=True] ***
2026-01-22 16:21:42,510 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:21:42 +0000 (0:00:00.038) 0:07:32.614 ******
2026-01-22 16:21:42,510 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:21:42 +0000 (0:00:00.038) 0:07:32.612 ******
2026-01-22 16:21:42,534 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:21:42,543 p=31411 u=zuul n=ansible | TASK [Prepare OpenStack control plane CR name=install_yamls_makes, tasks_from=make_openstack_deploy_prep] ***
2026-01-22 16:21:42,543 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:21:42 +0000 (0:00:00.033) 0:07:32.647 ******
2026-01-22 16:21:42,543 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:21:42 +0000 (0:00:00.033) 0:07:32.646 ******
2026-01-22 16:21:42,601 p=31411 u=zuul n=ansible | TASK [install_yamls_makes : Debug make_openstack_deploy_prep_env var=make_openstack_deploy_prep_env] ***
2026-01-22 16:21:42,601 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:21:42 +0000 (0:00:00.057) 0:07:32.705 ******
2026-01-22 16:21:42,601 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:21:42 +0000 (0:00:00.057) 0:07:32.704 ******
2026-01-22 16:21:42,635 p=31411 u=zuul n=ansible | ok: [localhost] =>
    make_openstack_deploy_prep_env:
      BMO_SETUP: false
      CHECKOUT_FROM_OPENSTACK_REF: 'true'
      CLEANUP_DIR_CMD: 'true'
      INSTALL_CERT_MANAGER: false
      KUBECONFIG: /home/zuul/.crc/machines/crc/kubeconfig
      NETWORK_MTU: 1500
      NEUTRON_BRANCH: ''
      NEUTRON_REPO: /home/zuul/src/github.com/openstack-k8s-operators/neutron-operator
      NNCP_DNS_SERVER: 192.168.122.10
      NNCP_INTERFACE: ens7
      OPENSTACK_K8S_BRANCH: main
      OUT: /home/zuul/ci-framework-data/artifacts/manifests
      OUTPUT_DIR: /home/zuul/ci-framework-data/artifacts/edpm
      PATH: /home/zuul/.crc/bin:/home/zuul/.crc/bin/oc:/home/zuul/bin:/home/zuul/.local/bin:/home/zuul/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin
2026-01-22 16:21:42,643 p=31411 u=zuul n=ansible | TASK [install_yamls_makes : Debug make_openstack_deploy_prep_params var=make_openstack_deploy_prep_params] ***
2026-01-22 16:21:42,643 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:21:42 +0000 (0:00:00.042) 0:07:32.747 ******
2026-01-22 16:21:42,644 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:21:42 +0000 (0:00:00.042) 0:07:32.746 ******
2026-01-22 16:21:42,674 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:21:42,682 p=31411 u=zuul n=ansible | TASK [install_yamls_makes : Run openstack_deploy_prep output_dir={{ cifmw_basedir }}/artifacts, chdir=/home/zuul/src/github.com/openstack-k8s-operators/install_yamls, script=make openstack_deploy_prep, dry_run={{ make_openstack_deploy_prep_dryrun|default(false)|bool }}, extra_args={{ dict((make_openstack_deploy_prep_env|default({})), **(make_openstack_deploy_prep_params|default({}))) }}] ***
2026-01-22 16:21:42,682 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:21:42 +0000 (0:00:00.038) 0:07:32.786 ******
2026-01-22 16:21:42,682 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:21:42 +0000 (0:00:00.038) 0:07:32.785 ******
2026-01-22 16:21:42,739 p=31411 u=zuul n=ansible | Follow script's output here: /home/zuul/ci-framework-data/logs/ci_script_007_run_openstack_deploy.log
2026-01-22 16:21:44,030 p=31411 u=zuul n=ansible | [WARNING]: conditional statements should not include jinja2 templating delimiters such as {{ }} or {% %}.
Found: {{ make_openstack_deploy_prep_until | default(true) }}
2026-01-22 16:21:44,031 p=31411 u=zuul n=ansible | changed: [localhost]
2026-01-22 16:21:44,049 p=31411 u=zuul n=ansible | TASK [Deploy NetConfig name=install_yamls_makes, tasks_from=make_netconfig_deploy] ***
2026-01-22 16:21:44,049 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:21:44 +0000 (0:00:01.367) 0:07:34.153 ******
2026-01-22 16:21:44,050 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:21:44 +0000 (0:00:01.367) 0:07:34.152 ******
2026-01-22 16:21:44,108 p=31411 u=zuul n=ansible | TASK [install_yamls_makes : Debug make_netconfig_deploy_env var=make_netconfig_deploy_env] ***
2026-01-22 16:21:44,108 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:21:44 +0000 (0:00:00.058) 0:07:34.212 ******
2026-01-22 16:21:44,108 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:21:44 +0000 (0:00:00.058) 0:07:34.211 ******
2026-01-22 16:21:44,137 p=31411 u=zuul n=ansible | ok: [localhost] =>
    make_netconfig_deploy_env:
      BMO_SETUP: false
      CHECKOUT_FROM_OPENSTACK_REF: 'true'
      INSTALL_CERT_MANAGER: false
      KUBECONFIG: /home/zuul/.crc/machines/crc/kubeconfig
      NETWORK_MTU: 1500
      NEUTRON_BRANCH: ''
      NEUTRON_REPO: /home/zuul/src/github.com/openstack-k8s-operators/neutron-operator
      NNCP_DNS_SERVER: 192.168.122.10
      NNCP_INTERFACE: ens7
      OPENSTACK_K8S_BRANCH: main
      OUT: /home/zuul/ci-framework-data/artifacts/manifests
      OUTPUT_DIR: /home/zuul/ci-framework-data/artifacts/edpm
      PATH: /home/zuul/.crc/bin:/home/zuul/.crc/bin/oc:/home/zuul/bin:/home/zuul/.local/bin:/home/zuul/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin
2026-01-22 16:21:44,147 p=31411 u=zuul n=ansible | TASK [install_yamls_makes : Debug make_netconfig_deploy_params var=make_netconfig_deploy_params] ***
2026-01-22 16:21:44,147 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:21:44 +0000 (0:00:00.038) 0:07:34.251 ******
2026-01-22 16:21:44,147 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:21:44 +0000 (0:00:00.038) 0:07:34.250 ******
2026-01-22 16:21:44,172 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:21:44,181 p=31411 u=zuul n=ansible | TASK [install_yamls_makes : Run netconfig_deploy output_dir={{ cifmw_basedir }}/artifacts, chdir=/home/zuul/src/github.com/openstack-k8s-operators/install_yamls, script=make netconfig_deploy, dry_run={{ make_netconfig_deploy_dryrun|default(false)|bool }}, extra_args={{ dict((make_netconfig_deploy_env|default({})), **(make_netconfig_deploy_params|default({}))) }}] ***
2026-01-22 16:21:44,181 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:21:44 +0000 (0:00:00.033) 0:07:34.285 ******
2026-01-22 16:21:44,181 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:21:44 +0000 (0:00:00.033) 0:07:34.283 ******
2026-01-22 16:21:44,237 p=31411 u=zuul n=ansible | Follow script's output here: /home/zuul/ci-framework-data/logs/ci_script_008_run_netconfig.log
2026-01-22 16:21:50,296 p=31411 u=zuul n=ansible | [WARNING]: conditional statements should not include jinja2 templating delimiters such as {{ }} or {% %}.
Found: {{ make_netconfig_deploy_until | default(true) }}
2026-01-22 16:21:50,298 p=31411 u=zuul n=ansible | changed: [localhost]
2026-01-22 16:21:50,312 p=31411 u=zuul n=ansible | TASK [edpm_prepare : Kustomize and deploy OpenStackControlPlane _raw_params=kustomize_and_deploy.yml] ***
2026-01-22 16:21:50,312 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:21:50 +0000 (0:00:06.131) 0:07:40.416 ******
2026-01-22 16:21:50,312 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:21:50 +0000 (0:00:06.131) 0:07:40.415 ******
2026-01-22 16:21:50,345 p=31411 u=zuul n=ansible | included: /home/zuul/src/github.com/openstack-k8s-operators/ci-framework/roles/edpm_prepare/tasks/kustomize_and_deploy.yml for localhost
2026-01-22 16:21:50,362 p=31411 u=zuul n=ansible | TASK [edpm_prepare : Controlplane name _ctlplane_name=controlplane] ************
2026-01-22 16:21:50,363 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:21:50 +0000 (0:00:00.050) 0:07:40.466 ******
2026-01-22 16:21:50,363 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:21:50 +0000 (0:00:00.050) 0:07:40.465 ******
2026-01-22 16:21:50,388 p=31411 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:21:50,397 p=31411 u=zuul n=ansible | TASK [edpm_prepare : Set vars related to update_containers content provider cifmw_update_containers_registry={{ content_provider_os_registry_url | split('/') | first }}, cifmw_update_containers_org={{ content_provider_os_registry_url | split('/') | last }}, cifmw_update_containers_tag={{ content_provider_dlrn_md5_hash }}, cifmw_update_containers_openstack=True] ***
2026-01-22 16:21:50,397 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:21:50 +0000 (0:00:00.034) 0:07:40.501 ******
2026-01-22 16:21:50,397 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:21:50 +0000 (0:00:00.034) 0:07:40.499 ******
2026-01-22 16:21:50,417 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:21:50,425 p=31411 u=zuul n=ansible | TASK [Prepare OpenStackVersion CR name=update_containers] **********************
2026-01-22 16:21:50,425 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:21:50 +0000 (0:00:00.028) 0:07:40.529 ******
2026-01-22 16:21:50,425 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:21:50 +0000 (0:00:00.028) 0:07:40.528 ******
2026-01-22 16:21:50,446 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:21:50,455 p=31411 u=zuul n=ansible | TASK [edpm_prepare : Controlplane name kustomization _ctlplane_name_kustomizations=[{'apiVersion': 'kustomize.config.k8s.io/v1beta1', 'kind': 'Kustomization', 'patches': [{'target': {'kind': 'OpenStackControlPlane'}, 'patch': '- op: replace\n path: /metadata/name\n value: {{ _ctlplane_name }}'}]}]] ***
2026-01-22 16:21:50,455 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:21:50 +0000 (0:00:00.029) 0:07:40.559 ******
2026-01-22 16:21:50,455 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:21:50 +0000 (0:00:00.029) 0:07:40.557 ******
2026-01-22 16:21:50,477 p=31411 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:21:50,492 p=31411 u=zuul n=ansible | TASK [edpm_prepare : Perform kustomizations to the OpenStackControlPlane CR target_path={{ cifmw_edpm_prepare_openstack_crs_path }}, sort_ascending=False, kustomizations={{ cifmw_edpm_prepare_kustomizations + _ctlplane_name_kustomizations + (cifmw_edpm_prepare_extra_kustomizations | default([])) }}, kustomizations_paths={{ [ ( [ cifmw_edpm_prepare_manifests_dir, 'kustomizations', 'controlplane' ] | ansible.builtin.path_join ) ] }}] ***
2026-01-22 16:21:50,492 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:21:50 +0000 (0:00:00.037) 0:07:40.596 ******
2026-01-22 16:21:50,492 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:21:50 +0000 (0:00:00.037) 0:07:40.595 ******
2026-01-22 16:21:51,419 p=31411 u=zuul n=ansible | changed: [localhost]
2026-01-22 16:21:51,431 p=31411 u=zuul n=ansible | TASK [edpm_prepare : Log the CR that is about to be applied var=cifmw_edpm_prepare_crs_kustomize_result] ***
2026-01-22 16:21:51,431 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:21:51 +0000 (0:00:00.938) 0:07:41.535 ******
2026-01-22 16:21:51,431 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:21:51 +0000 (0:00:00.938) 0:07:41.533 ******
2026-01-22 16:21:51,468 p=31411 u=zuul n=ansible | ok: [localhost] =>
    cifmw_edpm_prepare_crs_kustomize_result:
      changed: true
      count: 4
      failed: false
      kustomizations_paths:
      - /home/zuul/ci-framework-data/artifacts/manifests/openstack/openstack/cr/kustomization.yaml
      - /home/zuul/ci-framework-data/artifacts/manifests/kustomizations/controlplane/99-kustomization.yaml
      - /home/zuul/ci-framework-data/artifacts/manifests/kustomizations/controlplane/80-horizon-kustomization.yaml
      output_path: /home/zuul/ci-framework-data/artifacts/manifests/openstack/openstack/cr/cifmw-kustomization-result.yaml
      result:
      - apiVersion: core.openstack.org/v1beta1
        kind: OpenStackControlPlane
        metadata:
          labels:
            created-by: install_yamls
          name: controlplane
          namespace: openstack
        spec:
          barbican:
            apiOverride:
              route: {}
            template:
              barbicanAPI:
                override:
                  service:
                    internal:
                      metadata:
                        annotations:
                          metallb.universe.tf/address-pool: internalapi
                          metallb.universe.tf/allow-shared-ip: internalapi
                          metallb.universe.tf/loadBalancerIPs: 172.17.0.80
                      spec:
                        type: LoadBalancer
                replicas: 1
              barbicanKeystoneListener:
                replicas: 1
              barbicanWorker:
                replicas: 1
              databaseInstance: openstack
              secret: os**********et
          cinder:
            apiOverride:
              route: {}
            template:
              cinderAPI:
                override:
                  service:
                    internal:
                      metadata:
                        annotations:
                          metallb.universe.tf/address-pool: internalapi
                          metallb.universe.tf/allow-shared-ip: internalapi
                          metallb.universe.tf/loadBalancerIPs: 172.17.0.80
                      spec:
                        type: LoadBalancer
              cinderBackup:
                networkAttachments:
                - storage
                replicas: 0
              cinderScheduler:
                replicas: 1
              cinderVolumes:
                volume1:
                  networkAttachments:
                  - storage
                  replicas: 0
              databaseInstance: openstack
              secret: os**********et
          designate:
            apiOverride:
              route: {}
            enabled: false
            template:
              databaseInstance: openstack
              designateAPI:
                override:
                  service:
                    internal:
                      metadata:
                        annotations:
                          metallb.universe.tf/address-pool: internalapi
                          metallb.universe.tf/allow-shared-ip: internalapi
                          metallb.universe.tf/loadBalancerIPs: 172.17.0.80
                      spec:
                        type: LoadBalancer
              designateBackendbind9:
                networkAttachments:
                - designate
                replicas: 1
                storageClass: local-storage
                storageRequest: 10G
              designateCentral:
                replicas: 1
              designateMdns:
                networkAttachments:
                - designate
                replicas: 1
              designateProducer:
                replicas: 1
              designateWorker:
                networkAttachments:
                - designate
                replicas: 1
              secret: os**********et
          dns:
            template:
              options:
              - key: server
                values:
                - 192.168.122.10
              - key: no-negcache
                values: []
              override:
                service:
                  metadata:
                    annotations:
                      metallb.universe.tf/address-pool: ctlplane
                      metallb.universe.tf/allow-shared-ip: ctlplane
                      metallb.universe.tf/loadBalancerIPs: 192.168.122.80
                  spec:
                    type: LoadBalancer
              replicas: 1
          galera:
            templates:
              openstack:
                replicas: 1
                secret: os**********et
                storageRequest: 10G
              openstack-cell1:
                replicas: 1
                secret: os**********et
                storageRequest: 10G
          glance:
            apiOverrides:
              default:
                route: {}
            template:
              customServiceConfig: |
                [DEFAULT]
                enabled_backends = default_backend:swift
                [glance_store]
                default_backend = default_backend
                [default_backend]
                swift_store_create_container_on_put = True
                swift_store_auth_version = 3
                swift_store_auth_address = {{ .KeystoneInternalURL }}
                swift_store_endpoint_type = internalURL
                swift_store_user = service:glance
                swift_store_key = {{ .ServicePassword }}
              databaseInstance: openstack
              glanceAPIs:
                default:
                  networkAttachments:
                  - storage
                  override:
                    service:
                      internal:
                        metadata:
                          annotations:
                            metallb.universe.tf/address-pool: internalapi
                            metallb.universe.tf/allow-shared-ip: internalapi
                            metallb.universe.tf/loadBalancerIPs: 172.17.0.80
                        spec:
                          type: LoadBalancer
                  replicas: 1
              keystoneEndpoint: default
              secret: os**********et
              storage:
                storageClass: ''
                storageRequest: 10G
          heat:
            apiOverride:
              route: {}
            cnfAPIOverride:
              route: {}
            enabled: false
            template:
              databaseInstance: openstack
              heatAPI:
                override:
                  service:
                    internal:
                      metadata:
                        annotations:
                          metallb.universe.tf/address-pool: internalapi
                          metallb.universe.tf/allow-shared-ip: internalapi
                          metallb.universe.tf/loadBalancerIPs: 172.17.0.80
                      spec:
                        type: LoadBalancer
                replicas: 1
              heatEngine:
                override:
                  service:
                    internal:
                      metadata:
                        annotations:
                          metallb.universe.tf/address-pool: internalapi
                          metallb.universe.tf/allow-shared-ip: internalapi
                          metallb.universe.tf/loadBalancerIPs: 172.17.0.80
                      spec:
                        type: LoadBalancer
                replicas: 1
              secret: os**********et
          horizon:
            apiOverride:
              route: {}
            enabled: true
            template:
              memcachedInstance: memcached
              replicas: 1
              secret: os**********et
          ironic:
            enabled: false
            template:
              databaseInstance: openstack
              ironicAPI:
                replicas: 1
              ironicConductors:
              - replicas: 1
                storageRequest: 10G
              ironicInspector:
                replicas: 1
              ironicNeutronAgent:
                replicas: 1
              secret: os**********et
          keystone:
            apiOverride:
              route: {}
            template:
              databaseInstance: openstack
              override:
                service:
                  internal:
                    metadata:
                      annotations:
                        metallb.universe.tf/address-pool: internalapi
                        metallb.universe.tf/allow-shared-ip: internalapi
                        metallb.universe.tf/loadBalancerIPs: 172.17.0.80
                    spec:
                      type: LoadBalancer
              secret: os**********et
          manila:
            apiOverride:
              route: {}
            template:
              databaseInstance: openstack
              manilaAPI:
                networkAttachments:
                - internalapi
                override:
                  service:
                    internal:
                      metadata:
                        annotations:
                          metallb.universe.tf/address-pool: internalapi
                          metallb.universe.tf/allow-shared-ip: internalapi
                          metallb.universe.tf/loadBalancerIPs: 172.17.0.80
                      spec:
                        type: LoadBalancer
                replicas: 1
              manilaScheduler:
                replicas: 1
              manilaShares:
                share1:
                  networkAttachments:
                  - storage
                  replicas: 1
          memcached:
            templates:
              memcached:
                replicas: 1
          neutron:
            apiOverride:
              route: {}
            template:
              databaseInstance: openstack
              networkAttachments:
              - internalapi
              override:
                service:
                  internal:
                    metadata:
                      annotations:
                        metallb.universe.tf/address-pool: internalapi
                        metallb.universe.tf/allow-shared-ip: internalapi
                        metallb.universe.tf/loadBalancerIPs: 172.17.0.80
                    spec:
                      type: LoadBalancer
              secret: os**********et
          nova:
            apiOverride:
              route: {}
            template:
              apiServiceTemplate:
                override:
                  service:
                    internal:
                      metadata:
                        annotations:
                          metallb.universe.tf/address-pool: internalapi
                          metallb.universe.tf/allow-shared-ip: internalapi
                          metallb.universe.tf/loadBalancerIPs: 172.17.0.80
                      spec:
                        type: LoadBalancer
              cellTemplates:
                cell0:
                  cellDatabaseAccount: nova-cell0
                  cellDatabaseInstance: openstack
                  cellMessageBusInstance: rabbitmq
                  conductorServiceTemplate:
                    replicas: 1
                  hasAPIAccess: true
                cell1:
                  cellDatabaseAccount: nova-cell1
                  cellDatabaseInstance: openstack-cell1
                  cellMessageBusInstance: rabbitmq-cell1
                  conductorServiceTemplate:
                    replicas: 1
                  hasAPIAccess: true
              metadataServiceTemplate:
                override:
                  service:
                    metadata:
                      annotations:
                        metallb.universe.tf/address-pool: internalapi
                        metallb.universe.tf/allow-shared-ip: internalapi
                        metallb.universe.tf/loadBalancerIPs: 172.17.0.80
                    spec:
                      type: LoadBalancer
              secret: os**********et
          octavia:
            enabled: false
            template:
              databaseInstance: openstack
              octaviaAPI:
                replicas: 1
              secret: os**********et
          ovn:
            template:
              ovnController:
                networkAttachment: tenant
                nicMappings:
                  datacentre: ospbr
              ovnDBCluster:
                ovndbcluster-nb:
                  dbType: NB
                  networkAttachment: internalapi
                  storageRequest: 10G
                ovndbcluster-sb:
                  dbType: SB
                  networkAttachment: internalapi
                  storageRequest: 10G
          placement:
            apiOverride:
              route: {}
            template:
              databaseInstance: openstack
              override:
                service:
                  internal:
                    metadata:
                      annotations:
                        metallb.universe.tf/address-pool: internalapi
                        metallb.universe.tf/allow-shared-ip: internalapi
                        metallb.universe.tf/loadBalancerIPs: 172.17.0.80
                    spec:
                      type: LoadBalancer
              secret: os**********et
          rabbitmq:
            templates:
              rabbitmq:
                override:
                  service:
                    metadata:
                      annotations:
                        metallb.universe.tf/address-pool: internalapi
                        metallb.universe.tf/loadBalancerIPs: 172.17.0.85
                    spec:
                      type: LoadBalancer
              rabbitmq-cell1:
                override:
                  service:
                    metadata:
                      annotations:
                        metallb.universe.tf/address-pool: internalapi
                        metallb.universe.tf/loadBalancerIPs: 172.17.0.86
                    spec:
                      type: LoadBalancer
          redis:
            enabled: false
          secret: os**********et
          storageClass: local-storage
          swift:
            enabled: true
            proxyOverride:
              route: {}
            template:
              swiftProxy:
                networkAttachments:
                - storage
                override:
                  service:
                    internal:
                      metadata:
                        annotations:
                          metallb.universe.tf/address-pool: internalapi
                          metallb.universe.tf/allow-shared-ip: internalapi
                          metallb.universe.tf/loadBalancerIPs: 172.17.0.80
                      spec:
                        type: LoadBalancer
                replicas: 1
              swiftRing:
                ringReplicas: 1
              swiftStorage:
                networkAttachments:
                - storage
                replicas: 1
          telemetry:
            enabled: true
            template:
              autoscaling:
                aodh:
                  databaseAccount: aodh
                  databaseInstance: openstack
                  passwordSelectors: null
                  secret: os**********et
                enabled: false
                heatInstance: heat
              ceilometer:
                enabled: true
                secret: os**********et
              cloudkitty:
                apiTimeout: 0
                cloudKittyAPI:
                  override:
                    service:
                      internal:
                        metadata:
                          annotations:
                            metallb.universe.tf/address-pool: internalapi
                            metallb.universe.tf/allow-shared-ip: internalapi
                            metallb.universe.tf/loadBalancerIPs: 172.17.0.80
                        spec:
                          type: LoadBalancer
                  replicas: 1
                  resources: {}
                  tls:
                    api:
                      internal: {}
                      public: {}
                    caBundleSecretName: combined-ca-bundle
                cloudKittyProc:
                  replicas: 1
                  resources: {}
                  tls:
                    caBundleSecretName: combined-ca-bundle
                databaseAccount: cloudkitty
                databaseInstance: openstack
                enabled: false
                memcachedInstance: memcached
                passwordSelector:
                  aodhService: AodhPassword
                  ceilometerService: CeilometerPassword
                  cloudKittyService: CloudKittyPassword
                preserveJobs: false
                rabbitMqClusterName: rabbitmq
                s3StorageConfig:
                  schemas:
                  - effectiveDate: '2024-11-18'
                    version: v13
                  secret:
                    name: cloudkitty-loki-s3
                    type: s3
                secret: os**********et
                serviceUser: cloudkitty
                storageClass: local-storage
              logging:
                annotations:
                  metallb.universe.tf/address-pool: internalapi
                  metallb.universe.tf/allow-shared-ip: internalapi
                  metallb.universe.tf/loadBalancerIPs: 172.17.0.80
                cloNamespace: openshift-logging
                enabled: false
                ipaddr: 172.17.0.80
                port: 10514
              metricStorage:
                enabled: false
                monitoringStack:
                  alertingEnabled: true
                  scrapeInterval: 30s
                  storage:
                    persistent:
                      pvcStorageRequest: 10G
                    retention: 24h
                    strategy: persistent
2026-01-22 16:21:51,477 p=31411 u=zuul n=ansible | TASK [edpm_prepare : Apply the OpenStackControlPlane CR output_dir={{ cifmw_edpm_prepare_basedir }}/artifacts, script=oc apply -f {{ cifmw_edpm_prepare_crs_kustomize_result.output_path }}] ***
2026-01-22 16:21:51,477 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:21:51 +0000 (0:00:00.046) 0:07:41.581 ******
2026-01-22 16:21:51,477 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:21:51 +0000 (0:00:00.046) 0:07:41.580 ******
2026-01-22 16:21:51,524 p=31411 u=zuul n=ansible | Follow script's output here: /home/zuul/ci-framework-data/logs/ci_script_009_apply_the.log
2026-01-22 16:21:51,832 p=31411 u=zuul n=ansible | changed: [localhost]
2026-01-22 16:21:51,849 p=31411 u=zuul n=ansible | TASK [edpm_prepare : Wait for control plane to change its status seconds={{ cifmw_edpm_prepare_wait_controplane_status_change_sec }}] ***
2026-01-22 16:21:51,850 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:21:51 +0000 (0:00:00.372) 0:07:41.953 ******
2026-01-22 16:21:51,850 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:21:51 +0000 (0:00:00.372) 0:07:41.952 ******
2026-01-22 16:21:51,873 p=31411 u=zuul n=ansible | Pausing for 30 seconds
2026-01-22 16:22:21,899 p=31411 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:22:21,909 p=31411 u=zuul n=ansible | TASK [edpm_prepare : Wait for OpenStack controlplane to be deployed _raw_params=oc wait OpenStackControlPlane {{ _ctlplane_name }} --namespace={{ cifmw_install_yamls_defaults['NAMESPACE'] }} --for=condition=ready --timeout={{ cifmw_edpm_prepare_timeout }}m] ***
2026-01-22 16:22:21,909 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:22:21 +0000 (0:00:30.059) 0:08:12.013 ******
2026-01-22 16:22:21,909 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:22:21 +0000 (0:00:30.059) 0:08:12.012 ******
2026-01-22 16:28:01,850 p=31411 u=zuul n=ansible | changed: [localhost]
2026-01-22 16:28:01,865 p=31411 u=zuul n=ansible | TASK [Extract and install OpenStackControlplane CA role=install_openstack_ca] ***
2026-01-22 16:28:01,865 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:01 +0000 (0:05:39.955) 0:13:51.969 ******
2026-01-22 16:28:01,865 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:01 +0000 (0:05:39.955) 0:13:51.968 ******
2026-01-22 16:28:01,957 p=31411 u=zuul n=ansible | TASK [install_openstack_ca : Get CA bundle data with retries] ******************
2026-01-22 16:28:01,957 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:01 +0000 (0:00:00.092) 0:13:52.061 ******
2026-01-22 16:28:01,957 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:01 +0000 (0:00:00.091) 0:13:52.060 ******
2026-01-22 16:28:02,350 p=31411 u=zuul n=ansible | changed: [localhost]
2026-01-22 16:28:02,363 p=31411 u=zuul n=ansible | TASK [install_openstack_ca : Set _ca_bundle fact if CA returned from OCP] ******
2026-01-22 16:28:02,363 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:02 +0000 (0:00:00.405) 0:13:52.467 ******
2026-01-22 16:28:02,363 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:02 +0000 (0:00:00.405) 0:13:52.465 ******
2026-01-22 16:28:02,399 p=31411 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:28:02,413 p=31411 u=zuul n=ansible | TASK [install_openstack_ca : Creating tls-ca-bundle.pem from CA bundle dest={{ cifmw_install_openstack_ca_file_full_path }}, content={{ _ca_bundle }}, mode=0644] ***
2026-01-22 16:28:02,414 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:02 +0000 (0:00:00.050) 0:13:52.518 ******
2026-01-22 16:28:02,414 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:02 +0000 (0:00:00.050) 0:13:52.516 ******
2026-01-22 16:28:02,862 p=31411 u=zuul n=ansible | changed: [localhost]
2026-01-22 16:28:02,879 p=31411 u=zuul n=ansible | TASK [install_openstack_ca : Check if OpenStackControlplane CA file is present path={{ cifmw_install_openstack_ca_file_full_path }}, get_attributes=False, get_checksum=False, get_mime=False] ***
2026-01-22 16:28:02,879 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:02 +0000 (0:00:00.465) 0:13:52.983 ******
2026-01-22 16:28:02,879 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:02 +0000 (0:00:00.465) 0:13:52.982 ******
2026-01-22 16:28:03,066 p=31411 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:28:03,074 p=31411 u=zuul n=ansible | TASK [Call install_ca role to inject OpenStackControlplane CA file if present role=install_ca] ***
2026-01-22 16:28:03,074 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:03 +0000 (0:00:00.195) 0:13:53.178 ******
2026-01-22 16:28:03,074 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:03 +0000 (0:00:00.195) 0:13:53.177 ******
2026-01-22 16:28:03,125 p=31411 u=zuul n=ansible | TASK [install_ca : Ensure target directory exists path={{ cifmw_install_ca_trust_dir }}, state=directory, mode=0755] ***
2026-01-22 16:28:03,125 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:03 +0000 (0:00:00.051) 0:13:53.229 ******
2026-01-22 16:28:03,126 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:03 +0000 (0:00:00.051) 0:13:53.228 ******
2026-01-22 16:28:03,345 p=31411 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:28:03,353 p=31411 u=zuul n=ansible | TASK [install_ca : Install internal CA from url url={{ cifmw_install_ca_url }}, dest={{ cifmw_install_ca_trust_dir }}, validate_certs={{ cifmw_install_ca_url_validate_certs | default(omit) }}, mode=0644] ***
2026-01-22 16:28:03,353 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:03 +0000 (0:00:00.227) 0:13:53.457 ******
2026-01-22 16:28:03,353 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:03 +0000 (0:00:00.227) 0:13:53.455 ******
2026-01-22 16:28:03,375 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:28:03,383 p=31411 u=zuul n=ansible | TASK [install_ca : Install custom CA bundle from inline dest={{ cifmw_install_ca_trust_dir }}/cifmw_inline_ca_bundle.crt, content={{ cifmw_install_ca_bundle_inline }}, mode=0644] ***
2026-01-22 16:28:03,383 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:03 +0000 (0:00:00.030) 0:13:53.487 ******
2026-01-22 16:28:03,383 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:03 +0000 (0:00:00.030) 0:13:53.486 ******
2026-01-22 16:28:03,408 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:28:03,416 p=31411 u=zuul n=ansible | TASK [install_ca : Install custom CA bundle from file dest={{ cifmw_install_ca_trust_dir }}/{{ cifmw_install_ca_bundle_src | basename }}, src={{ cifmw_install_ca_bundle_src }}, mode=0644] ***
2026-01-22 16:28:03,416 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:03 +0000 (0:00:00.032) 0:13:53.520 ******
2026-01-22 16:28:03,416 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:03 +0000 (0:00:00.032) 0:13:53.519 ******
2026-01-22 16:28:03,867 p=31411 u=zuul n=ansible | changed: [localhost]
2026-01-22 16:28:03,875 p=31411 u=zuul n=ansible | TASK [install_ca : Update ca bundle _raw_params=update-ca-trust] ***************
2026-01-22 16:28:03,875 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:03 +0000 (0:00:00.458) 0:13:53.979 ******
2026-01-22 16:28:03,875 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:03 +0000 (0:00:00.458) 0:13:53.978 ******
2026-01-22 16:28:05,571 p=31411 u=zuul n=ansible | changed: [localhost]
2026-01-22 16:28:05,591 p=31411 u=zuul n=ansible | TASK [edpm_prepare : Extract keystone endpoint host _raw_params=oc get keystoneapi keystone --namespace={{ cifmw_install_yamls_defaults['NAMESPACE'] }} -o jsonpath='{ .status.apiEndpoints.public }'] ***
2026-01-22 16:28:05,591 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:05 +0000 (0:00:01.715) 0:13:55.695 ******
2026-01-22 16:28:05,591 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:05 +0000 (0:00:01.715) 0:13:55.693 ******
2026-01-22 16:28:05,958 p=31411 u=zuul n=ansible | changed: [localhost]
2026-01-22 16:28:05,965 p=31411 u=zuul n=ansible | TASK [edpm_prepare : Wait for keystone endpoint to exist in DNS url={{ _cifmw_edpm_prepare_keystone_endpoint_out.stdout | trim }}, status_code={{ _keystone_response_codes }}, validate_certs={{ cifmw_edpm_prepare_verify_tls }}] ***
2026-01-22 16:28:05,966 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:05 +0000 (0:00:00.374) 0:13:56.069 ******
2026-01-22 16:28:05,966 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:05 +0000 (0:00:00.374) 0:13:56.068 ******
2026-01-22 16:28:06,438 p=31411 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:28:06,469 p=31411 u=zuul n=ansible | TASK [run_hook : Assert parameters are valid quiet=True, that=['_list_hooks is not string', '_list_hooks is not mapping', '_list_hooks is iterable', '(hooks | default([])) is not string', '(hooks | default([])) is not mapping', '(hooks | default([])) is iterable']] ***
2026-01-22 16:28:06,469 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:06 +0000 (0:00:00.503) 0:13:56.573 ******
2026-01-22 16:28:06,469 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:06 +0000 (0:00:00.503) 0:13:56.571 ******
2026-01-22 16:28:06,532 p=31411 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:28:06,556 p=31411 u=zuul n=ansible | TASK [run_hook : Assert single hooks are all mappings quiet=True, that=['_not_mapping_hooks | length == 0'], msg=All single hooks must be a list of mappings or a mapping.] ***
2026-01-22 16:28:06,557 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:06 +0000 (0:00:00.088) 0:13:56.661 ******
2026-01-22 16:28:06,557 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:06 +0000 (0:00:00.088) 0:13:56.660 ******
2026-01-22 16:28:06,669 p=31411 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:28:06,679 p=31411 u=zuul n=ansible | TASK [run_hook : Loop on hooks for post_ctlplane_deploy _raw_params={{ hook.type }}.yml] ***
2026-01-22 16:28:06,679 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:06 +0000 (0:00:00.121) 0:13:56.783 ******
2026-01-22 16:28:06,679 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:06 +0000 (0:00:00.121) 0:13:56.781 ******
2026-01-22 16:28:06,804 p=31411 u=zuul n=ansible | included: /home/zuul/src/github.com/openstack-k8s-operators/ci-framework/roles/run_hook/tasks/playbook.yml for localhost => (item={'name': 'Tune rabbitmq resources', 'type': 'playbook', 'source': 'rabbitmq_tuning.yml'})
2026-01-22 16:28:06,823 p=31411 u=zuul n=ansible | TASK [run_hook : Set playbook path for Tune rabbitmq resources cifmw_basedir={{ _bdir }}, hook_name={{ _hook_name }}, playbook_path={{ _play | realpath }}, log_path={{ _bdir }}/logs/{{ step }}_{{ _hook_name }}.log, extra_vars=-e namespace={{ cifmw_openstack_namespace }} {%- if hook.extra_vars is defined and hook.extra_vars|length > 0 -%} {% for key,value in hook.extra_vars.items() -%} {%- if key == 'file' %} -e "@{{ value }}" {%- else %} -e "{{ key }}={{ value }}" {%- endif %} {%- endfor %} {%- endif %}] ***
2026-01-22 16:28:06,824 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:06 +0000 (0:00:00.144) 0:13:56.928 ******
2026-01-22 16:28:06,824 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:06 +0000 (0:00:00.144) 0:13:56.926 ******
2026-01-22 16:28:06,883 p=31411 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:28:06,892 p=31411 u=zuul n=ansible | TASK [run_hook : Get file stat path={{ playbook_path }}] ***********************
2026-01-22 16:28:06,892 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:06 +0000 (0:00:00.068) 0:13:56.996 ******
2026-01-22 16:28:06,893 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:06 +0000 (0:00:00.068) 0:13:56.995 ******
2026-01-22 16:28:07,099 p=31411 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:28:07,108 p=31411 u=zuul n=ansible | TASK [run_hook : Fail if playbook doesn't exist msg=Playbook {{ playbook_path }} doesn't seem to exist.] ***
2026-01-22 16:28:07,108 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:07 +0000 (0:00:00.216) 0:13:57.212 ******
2026-01-22 16:28:07,109 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:07 +0000 (0:00:00.216) 0:13:57.211 ******
2026-01-22 16:28:07,132 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:28:07,141 p=31411 u=zuul n=ansible | TASK [run_hook : Get parameters files paths={{ (cifmw_basedir, 'artifacts/parameters') | path_join }}, file_type=file, patterns=*.yml] ***
2026-01-22 16:28:07,141 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:07 +0000 (0:00:00.032) 0:13:57.245 ******
2026-01-22 16:28:07,141 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:07 +0000 (0:00:00.032) 0:13:57.244 ******
2026-01-22 16:28:07,332 p=31411 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:28:07,340 p=31411 u=zuul n=ansible | TASK [run_hook : Add parameters artifacts as extra variables extra_vars={{ extra_vars }} {% for file in cifmw_run_hook_parameters_files.files %} -e "@{{ file.path }}" {%- endfor %}] ***
2026-01-22 16:28:07,340 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:07 +0000 (0:00:00.198) 0:13:57.444 ******
2026-01-22 16:28:07,340 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:07 +0000 (0:00:00.198) 0:13:57.443 ******
2026-01-22 16:28:07,365 p=31411 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:28:07,374 p=31411 u=zuul n=ansible | TASK [run_hook : Ensure log directory exists path={{ log_path | dirname }}, state=directory, mode=0755] ***
2026-01-22 16:28:07,374 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:07 +0000 (0:00:00.033) 0:13:57.478 ******
2026-01-22 16:28:07,374 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:07 +0000 (0:00:00.033) 0:13:57.476 ******
2026-01-22 16:28:07,558 p=31411 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:28:07,567 p=31411 u=zuul n=ansible | TASK [run_hook : Ensure artifacts directory exists path={{ cifmw_basedir }}/artifacts, state=directory, mode=0755] ***
2026-01-22 16:28:07,567 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:07 +0000 (0:00:00.193) 0:13:57.671 ******
2026-01-22 16:28:07,567 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:07 +0000 (0:00:00.193) 0:13:57.669 ******
2026-01-22 16:28:07,767 p=31411 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:28:07,789 p=31411 u=zuul n=ansible | TASK [run_hook : Run hook without retry - Tune rabbitmq resources] *************
2026-01-22 16:28:07,790 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:07 +0000 (0:00:00.222) 0:13:57.893 ******
2026-01-22 16:28:07,790 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:07 +0000 (0:00:00.222) 0:13:57.892 ******
2026-01-22 16:28:07,854 p=31411 u=zuul n=ansible | Follow script's output here: /home/zuul/ci-framework-data/logs/ci_script_010_run_hook_without_retry_tune.log
2026-01-22 16:28:10,429 p=31411 u=zuul n=ansible | changed: [localhost]
2026-01-22 16:28:10,439 p=31411 u=zuul n=ansible | TASK [run_hook : Run hook with retry - Tune rabbitmq resources] ****************
2026-01-22 16:28:10,439 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:10 +0000 (0:00:02.649) 0:14:00.543 ******
2026-01-22 16:28:10,439 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:10 +0000 (0:00:02.649) 0:14:00.542 ******
2026-01-22 16:28:10,469 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:28:10,480 p=31411 u=zuul n=ansible | TASK [run_hook : Check if we have a file path={{ cifmw_basedir }}/artifacts/{{ step }}_{{ hook_name }}.yml] ***
2026-01-22 16:28:10,480 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:10 +0000 (0:00:00.040) 0:14:00.584 ******
2026-01-22 16:28:10,480 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:10 +0000 (0:00:00.040) 0:14:00.582 ******
2026-01-22 16:28:10,653 p=31411 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:28:10,663 p=31411 u=zuul n=ansible | TASK [run_hook : Load generated content in main playbook file={{ cifmw_basedir }}/artifacts/{{ step }}_{{ hook_name }}.yml] ***
2026-01-22 16:28:10,663 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:10 +0000 (0:00:00.183) 0:14:00.767 ******
2026-01-22 16:28:10,663 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:10 +0000 (0:00:00.183) 0:14:00.766 ******
2026-01-22 16:28:10,686 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:28:10,704 p=31411 u=zuul n=ansible | TASK [cifmw_setup : Load parameters files dir={{ cifmw_basedir }}/artifacts/parameters] ***
2026-01-22 16:28:10,704 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:10 +0000 (0:00:00.040) 0:14:00.808 ******
2026-01-22 16:28:10,704 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:10 +0000 (0:00:00.040) 0:14:00.806 ******
2026-01-22 16:28:10,760 p=31411 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:28:10,770 p=31411 u=zuul n=ansible | TASK [edpm_deploy_baremetal : Define minimal set of repo variables when not running on Zuul _install_yamls_repos={{ ( { 'OPENSTACK_REPO': operators_build_output[cifmw_operator_build_meta_name].git_src_dir, 'OPENSTACK_BRANCH': '', 'GIT_CLONE_OPTS': '-l', } if (cifmw_operator_build_meta_name is defined and cifmw_operator_build_meta_name in operators_build_output) else {} ) }}] ***
2026-01-22 16:28:10,770 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:10 +0000 (0:00:00.066) 0:14:00.874 ******
2026-01-22 16:28:10,770 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:10 +0000 (0:00:00.066) 0:14:00.873 ******
2026-01-22 16:28:10,794 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:28:10,803 p=31411 u=zuul n=ansible | TASK [edpm_deploy_baremetal : Set install_yamls Makefile environment variables cifmw_edpm_deploy_baremetal_common_env={{ cifmw_install_yamls_environment | combine({'PATH': cifmw_path}) | combine(_install_yamls_repos | default({})) }}, cifmw_edpm_deploy_baremetal_make_openstack_env={{ cifmw_edpm_deploy_baremetal_make_openstack_env | default({}) | combine( { 'OPENSTACK_IMG': operators_build_output[cifmw_operator_build_meta_name].image_catalog, } if (cifmw_operator_build_meta_name is defined and cifmw_operator_build_meta_name in operators_build_output) else {} ) }}, cifmw_edpm_deploy_baremetal_operators_build_output={{ operators_build_output }}] ***
2026-01-22 16:28:10,803 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:10 +0000 (0:00:00.033) 0:14:00.907 ******
2026-01-22 16:28:10,803 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:10 +0000 (0:00:00.033) 0:14:00.906 ******
2026-01-22 16:28:10,827 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:28:10,837 p=31411 u=zuul n=ansible | TASK [Create virtual baremetal name=install_yamls_makes, tasks_from=make_edpm_baremetal_compute] ***
2026-01-22 16:28:10,837 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:10 +0000 (0:00:00.033) 0:14:00.941 ******
2026-01-22 16:28:10,837 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:10 +0000 (0:00:00.033) 0:14:00.939 ******
2026-01-22 16:28:10,860 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:28:10,868 p=31411 u=zuul n=ansible | TASK [edpm_deploy_baremetal : Create the config file mode=0644, content={{ cifmw_edpm_deploy_baremetal_nova_compute_extra_config }}, dest={{ _cifmw_edpm_deploy_baremetal_nova_extra_config_file }}] ***
2026-01-22 16:28:10,869 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:10 +0000 (0:00:00.031) 0:14:00.972 ******
2026-01-22 16:28:10,869 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:10 +0000 (0:00:00.031) 0:14:00.971 ******
2026-01-22 16:28:10,893 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:28:10,901 p=31411 u=zuul n=ansible | TASK [edpm_deploy_baremetal : Define DATAPLANE_EXTRA_NOVA_CONFIG_FILE cifmw_edpm_deploy_baremetal_common_env={{ cifmw_edpm_deploy_baremetal_common_env | default({}) | combine({'DATAPLANE_EXTRA_NOVA_CONFIG_FILE': _cifmw_edpm_deploy_baremetal_nova_extra_config_file }) }}, cacheable=True] ***
2026-01-22 16:28:10,901 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:10 +0000 (0:00:00.032) 0:14:01.005 ******
2026-01-22 16:28:10,901 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:10 +0000 (0:00:00.032) 0:14:01.003 ******
2026-01-22 16:28:10,925 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:28:10,934 p=31411 u=zuul n=ansible | TASK [Prepare OpenStack Dataplane NodeSet CR name=install_yamls_makes, tasks_from=make_edpm_deploy_baremetal_prep] ***
2026-01-22 16:28:10,934 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:10 +0000 (0:00:00.033) 0:14:01.038 ******
2026-01-22 16:28:10,934 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:10 +0000 (0:00:00.033) 0:14:01.037 ******
2026-01-22 16:28:10,958 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:28:10,968 p=31411 u=zuul n=ansible | TASK [edpm_deploy_baremetal : Perform kustomizations to the OpenStackDataPlaneNodeSet CR target_path={{ cifmw_edpm_deploy_openstack_crs_path }}, sort_ascending=False, kustomizations={% if content_provider_registry_ip is defined or not cifmw_edpm_deploy_baremetal_bootc %} apiVersion: kustomize.config.k8s.io/v1beta1 kind: Kustomization patches: - target: kind: OpenStackDataPlaneNodeSet patch: |- {% if content_provider_registry_ip is defined %} - op: add path: /spec/nodeTemplate/ansible/ansibleVars/edpm_container_registry_insecure_registries value: ["{{ content_provider_registry_ip }}:5001"] {% endif %} {% if not cifmw_edpm_deploy_baremetal_bootc %} - op: add path: /spec/nodeTemplate/ansible/ansibleVars/edpm_bootstrap_command value: sudo dnf -y update {% endif %} {% endif %}, kustomizations_paths={{ [ ( [ cifmw_edpm_deploy_baremetal_manifests_dir, 'kustomizations', 'dataplane' ] | ansible.builtin.path_join ) ] }}] ***
2026-01-22 16:28:10,968 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:10 +0000 (0:00:00.033) 0:14:01.072 ******
2026-01-22 16:28:10,968 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:10 +0000 (0:00:00.033) 0:14:01.070 ******
2026-01-22 16:28:10,993 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:28:11,004 p=31411 u=zuul n=ansible | TASK [edpm_deploy_baremetal : Log the CR that is about to be applied var=cifmw_edpm_deploy_baremetal_crs_kustomize_result] ***
2026-01-22 16:28:11,005 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:11 +0000 (0:00:00.036) 0:14:01.109 ******
2026-01-22 16:28:11,005 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:11 +0000 (0:00:00.036) 0:14:01.107 ******
2026-01-22 16:28:11,029 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:28:11,039 p=31411 u=zuul n=ansible | TASK [edpm_deploy_baremetal : Create repo-setup-downstream OpenStackDataPlaneService _raw_params=oc apply -n {{ cifmw_install_yamls_defaults['NAMESPACE'] }} -f "{{ cifmw_installyamls_repos }}/devsetup/edpm/services/dataplane_v1beta1_openstackdataplaneservice_reposetup_downstream.yaml"] ***
2026-01-22 16:28:11,039 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:11 +0000 (0:00:00.034) 0:14:01.143 ******
2026-01-22 16:28:11,039 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:11 +0000 (0:00:00.034) 0:14:01.142 ******
2026-01-22 16:28:11,063 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:28:11,073 p=31411 u=zuul n=ansible | TASK [edpm_deploy_baremetal : Get list of services defined under OpenStackDataPlaneNodeSet resource _raw_params=yq '.spec.services[]' {{ cifmw_edpm_deploy_baremetal_crs_kustomize_result.output_path }}] ***
2026-01-22 16:28:11,073 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:11 +0000 (0:00:00.033) 0:14:01.177 ******
2026-01-22 16:28:11,073 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:11 +0000 (0:00:00.033) 0:14:01.176 ******
2026-01-22 16:28:11,098 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:28:11,107 p=31411 u=zuul n=ansible | TASK [edpm_deploy_baremetal : Patch OpenStackDataPlaneNodeSet resource to add "repo-setup-downstream" service _raw_params=yq -i '.spec.services = ["repo-setup-downstream"] + .spec.services' {{ cifmw_edpm_deploy_baremetal_crs_kustomize_result.output_path }}] ***
2026-01-22 16:28:11,107 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:11 +0000 (0:00:00.033) 0:14:01.211 ******
2026-01-22 16:28:11,107 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:11 +0000 (0:00:00.033) 0:14:01.210 ******
2026-01-22 16:28:11,131 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:28:11,140 p=31411 u=zuul n=ansible | TASK [edpm_deploy_baremetal : Patch OpenStackDataPlaneNodeSet resource to replace "repo-setup" with "repo-setup-downstream" service _raw_params=yq -i '(.spec.services[] | select(. == "repo-setup")) |= "repo-setup-downstream"' {{ cifmw_edpm_deploy_baremetal_crs_kustomize_result.output_path }}] ***
2026-01-22 16:28:11,141 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:11 +0000 (0:00:00.033) 0:14:01.245 ******
2026-01-22 16:28:11,141 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:11 +0000 (0:00:00.033) 0:14:01.243 ******
2026-01-22 16:28:11,164 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:28:11,173 p=31411 u=zuul n=ansible | TASK [edpm_deploy_baremetal : Apply the OpenStackDataPlaneNodeSet CR output_dir={{ cifmw_edpm_deploy_baremetal_basedir }}/artifacts, script=oc apply -f {{ cifmw_edpm_deploy_baremetal_crs_kustomize_result.output_path }}] ***
2026-01-22 16:28:11,173 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:11 +0000 (0:00:00.032) 0:14:01.277 ******
2026-01-22 16:28:11,174 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:11 +0000 (0:00:00.032) 0:14:01.276 ******
2026-01-22 16:28:11,200 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:28:11,208 p=31411 u=zuul n=ansible | TASK [edpm_deploy_baremetal : Wait for Ironic to be ready _raw_params=oc wait pod -l name=ironic -n baremetal-operator-system --for=condition=Ready --timeout={{ cifmw_edpm_deploy_baremetal_wait_ironic_timeout_mins }}m] ***
2026-01-22 16:28:11,208 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:11 +0000 (0:00:00.034) 0:14:01.312 ******
2026-01-22 16:28:11,208 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:11 +0000 (0:00:00.034) 0:14:01.311 ******
2026-01-22 16:28:11,229 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:28:11,238 p=31411 u=zuul n=ansible | TASK [edpm_deploy_baremetal : Wait for OpenStack Provision Server pod to be created _raw_params=oc get po -l osp-provisionserver/name=openstack-edpm-ipam-provisionserver -n {{ cifmw_install_yamls_defaults['NAMESPACE'] }} -o name] ***
2026-01-22 16:28:11,238 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:11 +0000 (0:00:00.030) 0:14:01.342 ******
2026-01-22 16:28:11,239 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:11 +0000 (0:00:00.030) 0:14:01.341 ******
2026-01-22 16:28:11,260 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:28:11,269 p=31411 u=zuul n=ansible | TASK [edpm_deploy_baremetal : Wait for OpenStack Provision Server deployment to be available _raw_params=oc wait deployment openstack-edpm-ipam-provisionserver-openstackprovisionserver -n {{ cifmw_install_yamls_defaults['NAMESPACE'] }} --for condition=Available --timeout={{ cifmw_edpm_deploy_baremetal_wait_provisionserver_timeout_mins }}m] ***
2026-01-22 16:28:11,270 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:11 +0000 (0:00:00.031) 0:14:01.373 ******
2026-01-22 16:28:11,270 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:11 +0000 (0:00:00.031) 0:14:01.372 ******
2026-01-22 16:28:11,291 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:28:11,300 p=31411 u=zuul n=ansible | TASK [edpm_deploy_baremetal : Wait for baremetal nodes to reach 'provisioned' state _raw_params=oc wait bmh --all -n {{ cifmw_install_yamls_defaults['NAMESPACE'] }} --for=jsonpath='{.status.provisioning.state}'=provisioned --timeout={{ cifmw_edpm_deploy_baremetal_wait_bmh_timeout_mins }}m] ***
2026-01-22 16:28:11,300 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:11 +0000 (0:00:00.030) 0:14:01.404 ******
2026-01-22 16:28:11,300 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:11 +0000 (0:00:00.030) 0:14:01.402 ******
2026-01-22 16:28:11,322 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:28:11,331 p=31411 u=zuul n=ansible | TASK [edpm_deploy_baremetal : Register the list of compute nodes _raw_params=oc get bmh -n {{ cifmw_install_yamls_defaults['NAMESPACE'] }}] ***
2026-01-22 16:28:11,331 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:11 +0000 (0:00:00.030) 0:14:01.435 ******
2026-01-22 16:28:11,331 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:11 +0000 (0:00:00.030) 0:14:01.433 ******
2026-01-22 16:28:11,353 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:28:11,363 p=31411 u=zuul n=ansible | TASK [edpm_deploy_baremetal : Print the list of compute nodes var=compute_nodes_output.stdout_lines] ***
2026-01-22 16:28:11,363 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:11 +0000 (0:00:00.031) 0:14:01.467 ******
2026-01-22 16:28:11,363 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:11 +0000 (0:00:00.031) 0:14:01.465 ******
2026-01-22 16:28:11,384 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:28:11,393 p=31411 u=zuul n=ansible | TASK [edpm_deploy_baremetal : Wait for OpenStackDataPlaneNodeSet to be deployed _raw_params=oc wait OpenStackDataPlaneNodeSet {{ cr_name }} --namespace={{ cifmw_install_yamls_defaults['NAMESPACE'] }} --for=condition=ready --timeout={{ cifmw_edpm_deploy_baremetal_wait_dataplane_timeout_mins }}m] ***
2026-01-22 16:28:11,393 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:11 +0000 (0:00:00.030) 0:14:01.497 ******
2026-01-22 16:28:11,394 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:11 +0000 (0:00:00.030) 0:14:01.496 ******
2026-01-22 16:28:11,414 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:28:11,423 p=31411 u=zuul n=ansible | TASK [edpm_deploy_baremetal : Run nova-manage discover_hosts to ensure compute nodes are mapped _raw_params=oc rsh -n {{ cifmw_install_yamls_defaults['NAMESPACE'] }} nova-cell0-conductor-0 nova-manage cell_v2 discover_hosts --verbose] ***
2026-01-22 16:28:11,423 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:11 +0000 (0:00:00.029) 0:14:01.527 ******
2026-01-22 16:28:11,423 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:11 +0000 (0:00:00.029) 0:14:01.525 ******
2026-01-22 16:28:11,443 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:28:11,458 p=31411 u=zuul n=ansible | TASK [cifmw_setup : Load parameters files dir={{ cifmw_basedir }}/artifacts/parameters] ***
2026-01-22 16:28:11,459 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:11 +0000 (0:00:00.035) 0:14:01.562 ******
2026-01-22 16:28:11,459 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:11 +0000 (0:00:00.035) 0:14:01.561 ******
2026-01-22 16:28:11,500 p=31411 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:28:11,509 p=31411 u=zuul n=ansible | TASK [libvirt_manager : Set compute config and common environment facts compute_config={{ cifmw_libvirt_manager_configuration['vms']['compute'] }}, cifmw_libvirt_manager_common_env={{ cifmw_install_yamls_environment | combine({'PATH': cifmw_path }) }}, cacheable=True] ***
2026-01-22 16:28:11,510 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:11 +0000 (0:00:00.050) 0:14:01.613 ******
2026-01-22 16:28:11,510 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:11 +0000 (0:00:00.050) 0:14:01.612 ******
2026-01-22 16:28:11,534 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:28:11,542 p=31411 u=zuul n=ansible | TASK [libvirt_manager : Ensure needed directories exist path={{ item }}, state=directory, mode=0755] ***
2026-01-22 16:28:11,542 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:11 +0000 (0:00:00.032) 0:14:01.646 ******
2026-01-22 16:28:11,542 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:11 +0000 (0:00:00.032) 0:14:01.645 ******
2026-01-22 16:28:11,576 p=31411 u=zuul n=ansible | skipping: [localhost] => (item=/home/zuul/ci-framework-data/workload)
2026-01-22 16:28:11,586 p=31411 u=zuul n=ansible | skipping: [localhost] => (item=/home/zuul/ci-framework-data/artifacts/edpm_compute)
2026-01-22 16:28:11,597 p=31411 u=zuul n=ansible | skipping: [localhost] => (item=/home/zuul/ci-framework-data/artifacts/openstack/cr/)
2026-01-22 16:28:11,598 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:28:11,607 p=31411 u=zuul n=ansible | TASK [libvirt_manager : Ensure image is available _raw_params=get_image.yml] ***
2026-01-22 16:28:11,607 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:11 +0000 (0:00:00.065) 0:14:01.711 ******
2026-01-22 16:28:11,607 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:11 +0000 (0:00:00.065) 0:14:01.710 ******
2026-01-22 16:28:11,632 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:28:11,641 p=31411 u=zuul n=ansible | TASK [Create EDPM compute VMs name=install_yamls_makes, tasks_from=make_edpm_compute.yml] ***
2026-01-22 16:28:11,642 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:11 +0000 (0:00:00.034) 0:14:01.745 ******
2026-01-22 16:28:11,642 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:11 +0000 (0:00:00.034) 0:14:01.744 ******
2026-01-22 16:28:11,671 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:28:11,680 p=31411 u=zuul n=ansible | TASK [libvirt_manager : Catch compute IPs _raw_params=virsh -c qemu:///system -q domifaddr --source arp --domain edpm-compute-{{ item }}] ***
2026-01-22 16:28:11,680 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:11 +0000 (0:00:00.038) 0:14:01.784 ******
2026-01-22 16:28:11,680 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:11 +0000 (0:00:00.038) 0:14:01.783 ******
2026-01-22 16:28:11,710 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:28:11,720 p=31411 u=zuul n=ansible | TASK [libvirt_manager : Ensure we get SSH host={{ item.stdout.split()[-1].split('/')[0] }}, port=22, timeout=60] ***
2026-01-22 16:28:11,720 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:11 +0000 (0:00:00.039) 0:14:01.824 ******
2026-01-22 16:28:11,720 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:11 +0000 (0:00:00.039) 0:14:01.822 ******
2026-01-22 16:28:11,747 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:28:11,756 p=31411 u=zuul n=ansible | TASK [libvirt_manager : Output CR for extra computes dest={{ cifmw_libvirt_manager_basedir }}/artifacts/{{ cifmw_install_yamls_defaults['NAMESPACE'] }}/cr/99-cifmw-computes-{{ item }}.yaml, src=kustomize_compute.yml.j2, mode=0644] ***
2026-01-22 16:28:11,756 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:11 +0000 (0:00:00.036) 0:14:01.860 ******
2026-01-22 16:28:11,756 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:11 +0000 (0:00:00.036) 0:14:01.859 ******
2026-01-22 16:28:11,786 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:28:11,801 p=31411 u=zuul n=ansible | TASK [Prepare for HCI deploy phase 1 name=hci_prepare, tasks_from=phase1.yml] ***
2026-01-22 16:28:11,801 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:11 +0000 (0:00:00.045) 0:14:01.905 ******
2026-01-22 16:28:11,801 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:11 +0000 (0:00:00.045) 0:14:01.904 ******
2026-01-22 16:28:11,829 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:28:11,839 p=31411 u=zuul n=ansible | TASK [edpm_deploy : Set EDPM related vars cifmw_edpm_deploy_env={{ cifmw_install_yamls_environment | combine({'PATH': cifmw_path}) | combine({'DATAPLANE_REGISTRY_URL': cifmw_edpm_deploy_registry_url }) | combine({'DATAPLANE_CONTAINER_TAG': cifmw_repo_setup_full_hash | default(cifmw_install_yamls_defaults['DATAPLANE_CONTAINER_TAG']) }) | combine(cifmw_edpm_deploy_extra_vars | default({})) | combine(_install_yamls_repos | default({})) }}, cacheable=True] ***
2026-01-22 16:28:11,839 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:11 +0000 (0:00:00.037) 0:14:01.943 ******
2026-01-22 16:28:11,839 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:11 +0000 (0:00:00.037) 0:14:01.941 ******
2026-01-22 16:28:11,885 p=31411 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:28:11,894 p=31411 u=zuul n=ansible | TASK [edpm_deploy : Create the config file mode=0644, content={{ cifmw_edpm_deploy_nova_compute_extra_config }}, dest={{ _cifmw_edpm_deploy_nova_extra_config_file }}] ***
2026-01-22 16:28:11,894 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:11 +0000 (0:00:00.055) 0:14:01.998 ******
2026-01-22 16:28:11,894 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:11 +0000 (0:00:00.055) 0:14:01.997 ******
2026-01-22 16:28:12,306 p=31411 u=zuul n=ansible | changed: [localhost]
2026-01-22 16:28:12,314 p=31411 u=zuul n=ansible | TASK [edpm_deploy : Define DATAPLANE_EXTRA_NOVA_CONFIG_FILE cifmw_edpm_deploy_env={{ cifmw_edpm_deploy_env | default({}) | combine({'DATAPLANE_EXTRA_NOVA_CONFIG_FILE': _cifmw_edpm_deploy_nova_extra_config_file }) }}, cacheable=True] ***
2026-01-22 16:28:12,314 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:12 +0000 (0:00:00.420) 0:14:02.418 ******
2026-01-22 16:28:12,314 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:12 +0000 (0:00:00.419) 0:14:02.417 ******
2026-01-22 16:28:12,371 p=31411 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:28:12,380 p=31411 u=zuul n=ansible | TASK [Prepare OpenStack Dataplane NodeSet CR name=install_yamls_makes, tasks_from=make_edpm_deploy_prep] ***
2026-01-22 16:28:12,380 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:12 +0000 (0:00:00.065) 0:14:02.484 ******
2026-01-22 16:28:12,380 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:12 +0000 (0:00:00.065) 0:14:02.482 ******
2026-01-22 16:28:12,438 p=31411 u=zuul n=ansible | TASK [install_yamls_makes : Debug make_edpm_deploy_prep_env var=make_edpm_deploy_prep_env] ***
2026-01-22 16:28:12,438 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:12 +0000 (0:00:00.058) 0:14:02.542 ******
2026-01-22 16:28:12,438 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:12 +0000 (0:00:00.058) 0:14:02.541 ******
2026-01-22 16:28:12,470 p=31411 u=zuul n=ansible | ok: [localhost] =>
  make_edpm_deploy_prep_env:
    BMO_SETUP: false
    CHECKOUT_FROM_OPENSTACK_REF: 'true'
    DATAPLANE_COMPUTE_IP: 192.168.122.100
    DATAPLANE_CONTAINER_TAG: c3923531bcda0b0811b2d5053f189beb
    DATAPLANE_EXTRA_NOVA_CONFIG_FILE: /home/zuul/ci-framework-data/nova-extra-config.conf
    DATAPLANE_REGISTRY_URL: quay.io/podified-antelope-centos9
    DATAPLANE_SINGLE_NODE: 'true'
    DATAPLANE_SSHD_ALLOWED_RANGES: '[''0.0.0.0/0'']'
    DATAPLANE_TOTAL_NODES: 1
    INSTALL_CERT_MANAGER: false
    KUBECONFIG: /home/zuul/.crc/machines/crc/kubeconfig
    NEUTRON_BRANCH: ''
    NEUTRON_REPO: /home/zuul/src/github.com/openstack-k8s-operators/neutron-operator
    OPENSTACK_K8S_BRANCH: main
    OUT: /home/zuul/ci-framework-data/artifacts/manifests
    OUTPUT_DIR: /home/zuul/ci-framework-data/artifacts/edpm
    PATH: /home/zuul/.crc/bin:/home/zuul/.crc/bin/oc:/home/zuul/bin:/home/zuul/.local/bin:/home/zuul/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin
    SSH_KEY_FILE: /home/zuul/.ssh/id_cifw
2026-01-22 16:28:12,478 p=31411 u=zuul n=ansible | TASK [install_yamls_makes : Debug make_edpm_deploy_prep_params var=make_edpm_deploy_prep_params] ***
2026-01-22 16:28:12,478 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:12 +0000 (0:00:00.039) 0:14:02.582 ******
2026-01-22 16:28:12,478 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:12 +0000 (0:00:00.040) 0:14:02.581 ******
2026-01-22 16:28:12,505 p=31411 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:28:12,513 p=31411 u=zuul n=ansible | TASK [install_yamls_makes : Run edpm_deploy_prep output_dir={{ cifmw_basedir }}/artifacts, chdir=/home/zuul/src/github.com/openstack-k8s-operators/install_yamls, script=make edpm_deploy_prep, dry_run={{ make_edpm_deploy_prep_dryrun|default(false)|bool }}, extra_args={{ dict((make_edpm_deploy_prep_env|default({})), **(make_edpm_deploy_prep_params|default({}))) }}] ***
2026-01-22 16:28:12,513 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:12 +0000 (0:00:00.034) 0:14:02.617 ******
2026-01-22 16:28:12,513 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:12 +0000 (0:00:00.034) 0:14:02.616 ******
2026-01-22 16:28:12,567 p=31411 u=zuul n=ansible | Follow script's output here: /home/zuul/ci-framework-data/logs/ci_script_011_run_edpm_deploy.log
2026-01-22 16:28:23,173 p=31411 u=zuul n=ansible | [WARNING]: conditional statements should not include jinja2 templating delimiters such as {{ }} or {% %}. Found: {{ make_edpm_deploy_prep_until | default(true) }}
2026-01-22 16:28:23,174 p=31411 u=zuul n=ansible | changed: [localhost]
2026-01-22 16:28:23,190 p=31411 u=zuul n=ansible | TASK [edpm_deploy : Perform kustomizations to the OpenStackDataPlaneNodeSet CR target_path={{ cifmw_edpm_deploy_openstack_crs_path }}, sort_ascending=False, kustomizations_paths={{ [ ( [ cifmw_edpm_deploy_manifests_dir, 'kustomizations', 'dataplane' ] | ansible.builtin.path_join ) ] }}] ***
2026-01-22 16:28:23,190 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:23 +0000 (0:00:10.676) 0:14:13.294 ******
2026-01-22 16:28:23,190 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:23 +0000 (0:00:10.676) 0:14:13.292 ******
2026-01-22 16:28:23,547 p=31411 u=zuul n=ansible | changed: [localhost]
2026-01-22 16:28:23,556 p=31411 u=zuul n=ansible | TASK [edpm_deploy : Log the CR that is about to be applied var=cifmw_edpm_deploy_crs_kustomize_result] ***
2026-01-22 16:28:23,556 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:23 +0000 (0:00:00.366) 0:14:13.660 ******
2026-01-22 16:28:23,556 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:23 +0000 (0:00:00.366) 0:14:13.659 ******
2026-01-22 16:28:23,595 p=31411 u=zuul n=ansible | ok: [localhost] =>
  cifmw_edpm_deploy_crs_kustomize_result:
    changed: true
    count: 2
    failed: false
    kustomizations_paths:
    - /home/zuul/ci-framework-data/artifacts/manifests/openstack/dataplane/cr/kustomization.yaml
    - /home/zuul/ci-framework-data/artifacts/manifests/kustomizations/dataplane/99-kustomization.yaml
    output_path: /home/zuul/ci-framework-data/artifacts/manifests/openstack/dataplane/cr/cifmw-kustomization-result.yaml
    result:
    - apiVersion: v1
      data:
        network_config_template: |
          ---
          {% set mtu_list = [ctlplane_mtu] %}
          {% for network in nodeset_networks %}
          {% set _ = mtu_list.append(lookup('vars', networks_lower[network] ~ '_mtu')) %}
          {%- endfor %}
{% set min_viable_mtu = mtu_list | max %} network_config: - type: ovs_bridge name: {{ neutron_physical_bridge_name }} mtu: {{ min_viable_mtu }} use_dhcp: false dns_servers: {{ ctlplane_dns_nameservers }} domain: {{ dns_search_domains }} addresses: - ip_netmask: {{ ctlplane_ip }}/{{ ctlplane_cidr }} routes: {{ ctlplane_host_routes }} members: - type: interface name: nic1 mtu: {{ min_viable_mtu }} # force the MAC address of the bridge to this interface primary: true {% for network in nodeset_networks %} - type: vlan mtu: {{ lookup('vars', networks_lower[network] ~ '_mtu') }} vlan_id: {{ lookup('vars', networks_lower[network] ~ '_vlan_id') }} addresses: - ip_netmask: {{ lookup('vars', networks_lower[network] ~ '_ip') }}/{{ lookup('vars', networks_lower[network] ~ '_cidr') }} routes: {{ lookup('vars', networks_lower[network] ~ '_host_routes') }} {% endfor %} kind: ConfigMap metadata: labels: created-by: install_yamls name: network-config-template-ipam namespace: openstack - apiVersion: v1 data: physical_bridge_name: br-ex public_interface_name: eth0 kind: ConfigMap metadata: labels: created-by: install_yamls name: neutron-edpm-ipam namespace: openstack - apiVersion: v1 data: 25-nova-extra.conf: | [DEFAULT] force_config_drive = false kind: ConfigMap metadata: labels: created-by: install_yamls name: nova-extra-config namespace: openstack - apiVersion: dataplane.openstack.org/v1beta1 kind: OpenStackDataPlaneDeployment metadata: labels: created-by: install_yamls name: edpm-deployment namespace: openstack spec: nodeSets: - openstack-edpm-ipam - apiVersion: dataplane.openstack.org/v1beta1 kind: OpenStackDataPlaneNodeSet metadata: labels: created-by: install_yamls name: openstack-edpm-ipam namespace: openstack spec: env: - name: ANSIBLE_VERBOSITY value: '2' networkAttachments: - ctlplane nodeTemplate: ansible: ansibleUser: zuul ansibleVars: ctlplane_dns_nameservers: - 192.168.122.10 - 199.204.44.24 edpm_container_registry_insecure_registries: - 38.102.83.113:5001 
edpm_network_config_debug: true edpm_network_config_template: |- --- {% set mtu_list = [ctlplane_mtu] %} {% for network in nodeset_networks %} {% set _ = mtu_list.append(lookup('vars', networks_lower[network] ~ '_mtu')) %} {%- endfor %} {% set min_viable_mtu = mtu_list | max %} network_config: - type: interface name: nic1 use_dhcp: true mtu: {{ min_viable_mtu }} - type: ovs_bridge name: {{ neutron_physical_bridge_name }} mtu: {{ min_viable_mtu }} use_dhcp: false dns_servers: {{ ctlplane_dns_nameservers }} domain: {{ dns_search_domains }} addresses: - ip_netmask: {{ ctlplane_ip }}/{{ ctlplane_cidr }} routes: {{ ctlplane_host_routes }} members: - type: interface name: nic2 mtu: {{ min_viable_mtu }} # force the MAC address of the bridge to this interface primary: true {% if edpm_network_config_nmstate | bool %} # this ovs_extra configuration fixes OSPRH-17551, but it will be not needed when FDP-1472 is resolved ovs_extra: - "set interface eth1 external-ids:ovn-egress-iface=true" {% endif %} {% for network in nodeset_networks %} - type: vlan mtu: {{ lookup('vars', networks_lower[network] ~ '_mtu') }} vlan_id: {{ lookup('vars', networks_lower[network] ~ '_vlan_id') }} addresses: - ip_netmask: {{ lookup('vars', networks_lower[network] ~ '_ip') }}/{{ lookup('vars', networks_lower[network] ~ '_cidr') }} routes: {{ lookup('vars', networks_lower[network] ~ '_host_routes') }} {% endfor %} edpm_nodes_validation_validate_controllers_icmp: false edpm_nodes_validation_validate_gateway_icmp: false edpm_os_net_config_mappings: net_config_data_lookup: edpm-compute: nic2: eth1 edpm_sshd_allowed_ranges: - 0.0.0.0/0 enable_debug: false gather_facts: false image_prefix: openstack image_tag: c3923531bcda0b0811b2d5053f189beb neutron_public_interface_name: eth1 registry_url: quay.io/podified-antelope-centos9 timesync_ntp_servers: - hostname: pool.ntp.org ansibleVarsFrom: - configMapRef: name: network-config-template-ipam prefix: edpm_ - configMapRef: name: neutron-edpm-ipam prefix: 
neutron_ ansibleSSHPrivateKeySecret: dataplane-ansible-ssh-private-key-secret nodes: edpm-compute-0: ansible: ansibleHost: 192.168.122.100 hostName: compute-0 networks: - defaultRoute: false fixedIP: 192.168.122.100 name: ctlplane subnetName: subnet1 - name: internalapi subnetName: subnet1 - name: storage subnetName: subnet1 - name: tenant subnetName: subnet1 preProvisioned: true services: - repo-setup - redhat - bootstrap - download-cache - configure-network - validate-network - install-os - configure-os - ssh-known-hosts - run-os - reboot-os - install-certs - ovn - neutron-metadata - libvirt - nova - telemetry tlsEnabled: true 2026-01-22 16:28:23,604 p=31411 u=zuul n=ansible | TASK [edpm_deploy : Apply dataplane resources but ignore DataPlaneDeployment kubeconfig={{ cifmw_openshift_kubeconfig }}, api_key={{ cifmw_openshift_token | default(omit) }}, context={{ cifmw_openshift_context | default(omit) }}, state=present, definition={{ lookup('file', cifmw_edpm_deploy_crs_kustomize_result.output_path) | from_yaml_all | rejectattr('kind', 'search', cifmw_edpm_deploy_step2_kind) }}] *** 2026-01-22 16:28:23,604 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:23 +0000 (0:00:00.047) 0:14:13.708 ****** 2026-01-22 16:28:23,604 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:23 +0000 (0:00:00.047) 0:14:13.706 ****** 2026-01-22 16:28:24,478 p=31411 u=zuul n=ansible | changed: [localhost] 2026-01-22 16:28:24,489 p=31411 u=zuul n=ansible | TASK [edpm_deploy : Wait for OpenStackDataPlaneNodeSet become SetupReady _raw_params=oc wait OpenStackDataPlaneNodeSet {{ cr_name }} --namespace={{ cifmw_install_yamls_defaults['NAMESPACE'] }} --for=condition=SetupReady --timeout={{ cifmw_edpm_deploy_timeout }}m] *** 2026-01-22 16:28:24,489 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:24 +0000 (0:00:00.885) 0:14:14.593 ****** 2026-01-22 16:28:24,489 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:24 +0000 (0:00:00.885) 0:14:14.592 ****** 2026-01-22 
16:28:25,419 p=31411 u=zuul n=ansible | changed: [localhost] 2026-01-22 16:28:25,428 p=31411 u=zuul n=ansible | TASK [edpm_deploy : Apply DataPlaneDeployment resource kubeconfig={{ cifmw_openshift_kubeconfig }}, api_key={{ cifmw_openshift_token | default(omit) }}, context={{ cifmw_openshift_context | default(omit) }}, state=present, definition={{ lookup('file', cifmw_edpm_deploy_crs_kustomize_result.output_path) | from_yaml_all | selectattr('kind', 'search', cifmw_edpm_deploy_step2_kind) }}] *** 2026-01-22 16:28:25,428 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:25 +0000 (0:00:00.939) 0:14:15.532 ****** 2026-01-22 16:28:25,429 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:25 +0000 (0:00:00.939) 0:14:15.531 ****** 2026-01-22 16:28:26,259 p=31411 u=zuul n=ansible | changed: [localhost] 2026-01-22 16:28:26,268 p=31411 u=zuul n=ansible | TASK [edpm_deploy : Wait for OpenStackDataPlaneDeployment become Ready _raw_params=oc wait OpenStackDataPlaneDeployment {{ cr_name }} --namespace={{ cifmw_install_yamls_defaults['NAMESPACE'] }} --for=condition=Ready --timeout={{ cifmw_edpm_deploy_timeout }}m] *** 2026-01-22 16:28:26,268 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:26 +0000 (0:00:00.839) 0:14:16.372 ****** 2026-01-22 16:28:26,268 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:28:26 +0000 (0:00:00.839) 0:14:16.371 ****** 2026-01-22 16:51:40,898 p=31411 u=zuul n=ansible | changed: [localhost] 2026-01-22 16:51:40,905 p=31411 u=zuul n=ansible | TASK [edpm_deploy : Run nova-manage discover_hosts to ensure compute nodes are mapped output_dir={{ cifmw_basedir }}/artifacts, executable=/bin/bash, script=set -xe oc rsh --namespace={{ cifmw_install_yamls_defaults['NAMESPACE'] }} nova-cell0-conductor-0 nova-manage cell_v2 discover_hosts --verbose ] *** 2026-01-22 16:51:40,905 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:40 +0000 (0:23:14.636) 0:37:31.009 ****** 2026-01-22 16:51:40,905 p=31411 u=zuul n=ansible | Thursday 22 
January 2026 16:51:40 +0000 (0:23:14.636) 0:37:31.007 ****** 2026-01-22 16:51:40,975 p=31411 u=zuul n=ansible | Follow script's output here: /home/zuul/ci-framework-data/logs/ci_script_012_run_nova_manage_discover.log 2026-01-22 16:51:43,675 p=31411 u=zuul n=ansible | changed: [localhost] 2026-01-22 16:51:43,686 p=31411 u=zuul n=ansible | TASK [Validate EDPM name=install_yamls_makes, tasks_from=make_edpm_deploy_instance] *** 2026-01-22 16:51:43,687 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:43 +0000 (0:00:02.781) 0:37:33.790 ****** 2026-01-22 16:51:43,687 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:43 +0000 (0:00:02.781) 0:37:33.789 ****** 2026-01-22 16:51:43,714 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:51:43,756 p=31411 u=zuul n=ansible | PLAY [Deploy NFS server on target nodes] *************************************** 2026-01-22 16:51:43,775 p=31411 u=zuul n=ansible | TASK [cifmw_nfs : Set custom cifmw PATH reusable fact cifmw_path={{ ansible_user_dir }}/.crc/bin:{{ ansible_user_dir }}/.crc/bin/oc:{{ ansible_user_dir }}/bin:{{ ansible_env.PATH }}, cacheable=True] *** 2026-01-22 16:51:43,775 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:43 +0000 (0:00:00.088) 0:37:33.879 ****** 2026-01-22 16:51:43,775 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:43 +0000 (0:00:00.088) 0:37:33.878 ****** 2026-01-22 16:51:43,791 p=31411 u=zuul n=ansible | skipping: [compute-0] 2026-01-22 16:51:43,799 p=31411 u=zuul n=ansible | TASK [cifmw_nfs : Install required packages name=['nfs-utils', 'iptables']] **** 2026-01-22 16:51:43,799 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:43 +0000 (0:00:00.024) 0:37:33.903 ****** 2026-01-22 16:51:43,799 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:43 +0000 (0:00:00.024) 0:37:33.902 ****** 2026-01-22 16:51:43,815 p=31411 u=zuul n=ansible | skipping: [compute-0] 2026-01-22 16:51:43,823 p=31411 u=zuul n=ansible | TASK [cifmw_nfs : Configure nfs to use 
v4 only path=/etc/nfs.conf, section=nfsd, option=vers3, value=n, backup=True, mode=0644] *** 2026-01-22 16:51:43,823 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:43 +0000 (0:00:00.023) 0:37:33.927 ****** 2026-01-22 16:51:43,823 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:43 +0000 (0:00:00.023) 0:37:33.926 ****** 2026-01-22 16:51:43,839 p=31411 u=zuul n=ansible | skipping: [compute-0] 2026-01-22 16:51:43,846 p=31411 u=zuul n=ansible | TASK [cifmw_nfs : Disable NFSv3-related services name={{ item }}, masked=True] *** 2026-01-22 16:51:43,846 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:43 +0000 (0:00:00.022) 0:37:33.950 ****** 2026-01-22 16:51:43,846 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:43 +0000 (0:00:00.022) 0:37:33.949 ****** 2026-01-22 16:51:43,865 p=31411 u=zuul n=ansible | skipping: [compute-0] => (item=rpc-statd.service) 2026-01-22 16:51:43,869 p=31411 u=zuul n=ansible | skipping: [compute-0] => (item=rpcbind.service) 2026-01-22 16:51:43,872 p=31411 u=zuul n=ansible | skipping: [compute-0] => (item=rpcbind.socket) 2026-01-22 16:51:43,873 p=31411 u=zuul n=ansible | skipping: [compute-0] 2026-01-22 16:51:43,886 p=31411 u=zuul n=ansible | TASK [cifmw_nfs : Ensure shared folder exist path=/data/{{ item }}, state=directory, mode=755] *** 2026-01-22 16:51:43,886 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:43 +0000 (0:00:00.039) 0:37:33.990 ****** 2026-01-22 16:51:43,886 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:43 +0000 (0:00:00.039) 0:37:33.988 ****** 2026-01-22 16:51:43,969 p=31411 u=zuul n=ansible | skipping: [compute-0] 2026-01-22 16:51:43,975 p=31411 u=zuul n=ansible | TASK [cifmw_nfs : Set nfs network vars _raw_params=oc get ipset {{ _nfs_host }} -n {{ _ipset_namespace }} -o jsonpath='{.status.reservations[?(@.network=="{{ _nfs_network_name }}")]}'] *** 2026-01-22 16:51:43,975 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:43 +0000 (0:00:00.089) 0:37:34.079 
****** 2026-01-22 16:51:43,975 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:43 +0000 (0:00:00.089) 0:37:34.078 ****** 2026-01-22 16:51:43,995 p=31411 u=zuul n=ansible | skipping: [compute-0] 2026-01-22 16:51:44,003 p=31411 u=zuul n=ansible | TASK [cifmw_nfs : Store nfs network vars dest={{ cifmw_basedir }}/artifacts/parameters/nfs-params.yml, content={{ { 'cifmw_nfs_ip': cifmw_nfs_network_out.stdout | from_json | json_query('address'), 'cifmw_nfs_network_range': cifmw_nfs_network_out.stdout | from_json | json_query('cidr') } | to_nice_yaml }}, mode=0644] *** 2026-01-22 16:51:44,003 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:44 +0000 (0:00:00.028) 0:37:34.107 ****** 2026-01-22 16:51:44,003 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:44 +0000 (0:00:00.028) 0:37:34.106 ****** 2026-01-22 16:51:44,022 p=31411 u=zuul n=ansible | skipping: [compute-0] 2026-01-22 16:51:44,029 p=31411 u=zuul n=ansible | TASK [cifmw_nfs : Generate nftables rules file content=add rule inet filter EDPM_INPUT tcp dport 2049 accept , dest={{ nftables_path }}/nfs-server.nft, mode=0666] *** 2026-01-22 16:51:44,029 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:44 +0000 (0:00:00.026) 0:37:34.133 ****** 2026-01-22 16:51:44,030 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:44 +0000 (0:00:00.026) 0:37:34.132 ****** 2026-01-22 16:51:44,044 p=31411 u=zuul n=ansible | skipping: [compute-0] 2026-01-22 16:51:44,052 p=31411 u=zuul n=ansible | TASK [cifmw_nfs : Update nftables.conf and include nfs rules at the bottom path={{ nftables_conf }}, line=include "{{ nftables_path }}/nfs-server.nft", insertafter=EOF] *** 2026-01-22 16:51:44,052 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:44 +0000 (0:00:00.022) 0:37:34.156 ****** 2026-01-22 16:51:44,052 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:44 +0000 (0:00:00.022) 0:37:34.155 ****** 2026-01-22 16:51:44,067 p=31411 u=zuul n=ansible | skipping: [compute-0] 2026-01-22 
16:51:44,076 p=31411 u=zuul n=ansible | TASK [cifmw_nfs : Restart nftables service name=nftables, state=restarted] ***** 2026-01-22 16:51:44,076 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:44 +0000 (0:00:00.024) 0:37:34.180 ****** 2026-01-22 16:51:44,077 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:44 +0000 (0:00:00.024) 0:37:34.179 ****** 2026-01-22 16:51:44,091 p=31411 u=zuul n=ansible | skipping: [compute-0] 2026-01-22 16:51:44,099 p=31411 u=zuul n=ansible | TASK [cifmw_nfs : Configure the ip the nfs server should listen on path=/etc/nfs.conf, section=nfsd, option=host, value={{ cifmw_nfs_network_out.stdout | from_json | json_query('address') }}, backup=True, mode=0644] *** 2026-01-22 16:51:44,099 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:44 +0000 (0:00:00.022) 0:37:34.203 ****** 2026-01-22 16:51:44,099 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:44 +0000 (0:00:00.022) 0:37:34.202 ****** 2026-01-22 16:51:44,114 p=31411 u=zuul n=ansible | skipping: [compute-0] 2026-01-22 16:51:44,122 p=31411 u=zuul n=ansible | TASK [cifmw_nfs : Enable and restart nfs-server service name=nfs-server, state=restarted, enabled=True] *** 2026-01-22 16:51:44,122 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:44 +0000 (0:00:00.023) 0:37:34.226 ****** 2026-01-22 16:51:44,123 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:44 +0000 (0:00:00.023) 0:37:34.225 ****** 2026-01-22 16:51:44,138 p=31411 u=zuul n=ansible | skipping: [compute-0] 2026-01-22 16:51:44,146 p=31411 u=zuul n=ansible | TASK [cifmw_nfs : Add shares to /etc/exports path=/etc/exports, line=/data/{{ item }} {{ cifmw_nfs_network_out.stdout | from_json | json_query('cidr') }}(rw,sync,no_root_squash)] *** 2026-01-22 16:51:44,146 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:44 +0000 (0:00:00.023) 0:37:34.250 ****** 2026-01-22 16:51:44,146 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:44 +0000 (0:00:00.023) 0:37:34.249 ****** 
2026-01-22 16:51:44,161 p=31411 u=zuul n=ansible | skipping: [compute-0] 2026-01-22 16:51:44,169 p=31411 u=zuul n=ansible | TASK [cifmw_nfs : Export the shares _raw_params=exportfs -a] ******************* 2026-01-22 16:51:44,169 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:44 +0000 (0:00:00.022) 0:37:34.273 ****** 2026-01-22 16:51:44,169 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:44 +0000 (0:00:00.022) 0:37:34.271 ****** 2026-01-22 16:51:44,184 p=31411 u=zuul n=ansible | skipping: [compute-0] 2026-01-22 16:51:44,217 p=31411 u=zuul n=ansible | PLAY [Clear ceph target hosts facts to force refreshing in HCI deployments] **** 2026-01-22 16:51:44,233 p=31411 u=zuul n=ansible | TASK [Early end if architecture deploy _raw_params=end_play] ******************* 2026-01-22 16:51:44,233 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:44 +0000 (0:00:00.064) 0:37:34.337 ****** 2026-01-22 16:51:44,233 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:44 +0000 (0:00:00.064) 0:37:34.336 ****** 2026-01-22 16:51:44,243 p=31411 u=zuul n=ansible | skipping: [compute-0] 2026-01-22 16:51:44,248 p=31411 u=zuul n=ansible | TASK [Clear ceph target hosts facts _raw_params=clear_facts] ******************* 2026-01-22 16:51:44,249 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:44 +0000 (0:00:00.015) 0:37:34.352 ****** 2026-01-22 16:51:44,249 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:44 +0000 (0:00:00.015) 0:37:34.351 ****** 2026-01-22 16:51:44,256 p=31411 u=zuul n=ansible | skipping: [compute-0] 2026-01-22 16:51:44,282 p=31411 u=zuul n=ansible | PLAY [Deploy ceph using hooks] ************************************************* 2026-01-22 16:51:44,299 p=31411 u=zuul n=ansible | TASK [run_hook : Assert parameters are valid quiet=True, that=['_list_hooks is not string', '_list_hooks is not mapping', '_list_hooks is iterable', '(hooks | default([])) is not string', '(hooks | default([])) is not mapping', '(hooks | default([])) 
is iterable']] *** 2026-01-22 16:51:44,299 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:44 +0000 (0:00:00.050) 0:37:34.403 ****** 2026-01-22 16:51:44,299 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:44 +0000 (0:00:00.050) 0:37:34.402 ****** 2026-01-22 16:51:44,352 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:51:44,359 p=31411 u=zuul n=ansible | TASK [run_hook : Assert single hooks are all mappings quiet=True, that=['_not_mapping_hooks | length == 0'], msg=All single hooks must be a list of mappings or a mapping.] *** 2026-01-22 16:51:44,359 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:44 +0000 (0:00:00.059) 0:37:34.463 ****** 2026-01-22 16:51:44,359 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:44 +0000 (0:00:00.059) 0:37:34.462 ****** 2026-01-22 16:51:44,443 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:51:44,458 p=31411 u=zuul n=ansible | TASK [run_hook : Loop on hooks for post_ceph _raw_params={{ hook.type }}.yml] *** 2026-01-22 16:51:44,458 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:44 +0000 (0:00:00.099) 0:37:34.562 ****** 2026-01-22 16:51:44,459 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:44 +0000 (0:00:00.099) 0:37:34.561 ****** 2026-01-22 16:51:44,535 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:51:44,567 p=31411 u=zuul n=ansible | PLAY [Continue HCI deploy, deploy architecture and validate workflow] ********** 2026-01-22 16:51:44,597 p=31411 u=zuul n=ansible | TASK [Prepare for HCI deploy phase 2 name=hci_prepare, tasks_from=phase2.yml] *** 2026-01-22 16:51:44,597 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:44 +0000 (0:00:00.138) 0:37:34.701 ****** 2026-01-22 16:51:44,597 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:44 +0000 (0:00:00.138) 0:37:34.699 ****** 2026-01-22 16:51:44,614 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:51:44,623 p=31411 u=zuul n=ansible | TASK [Continue HCI deployment 
name=edpm_deploy] ******************************** 2026-01-22 16:51:44,623 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:44 +0000 (0:00:00.026) 0:37:34.727 ****** 2026-01-22 16:51:44,623 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:44 +0000 (0:00:00.026) 0:37:34.726 ****** 2026-01-22 16:51:44,641 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:51:44,652 p=31411 u=zuul n=ansible | TASK [run_hook : Assert parameters are valid quiet=True, that=['_list_hooks is not string', '_list_hooks is not mapping', '_list_hooks is iterable', '(hooks | default([])) is not string', '(hooks | default([])) is not mapping', '(hooks | default([])) is iterable']] *** 2026-01-22 16:51:44,652 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:44 +0000 (0:00:00.028) 0:37:34.756 ****** 2026-01-22 16:51:44,652 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:44 +0000 (0:00:00.028) 0:37:34.755 ****** 2026-01-22 16:51:44,721 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:51:44,730 p=31411 u=zuul n=ansible | TASK [run_hook : Assert single hooks are all mappings quiet=True, that=['_not_mapping_hooks | length == 0'], msg=All single hooks must be a list of mappings or a mapping.] 
*** 2026-01-22 16:51:44,730 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:44 +0000 (0:00:00.077) 0:37:34.834 ****** 2026-01-22 16:51:44,730 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:44 +0000 (0:00:00.077) 0:37:34.832 ****** 2026-01-22 16:51:44,813 p=31411 u=zuul n=ansible | ok: [localhost] 2026-01-22 16:51:44,824 p=31411 u=zuul n=ansible | TASK [run_hook : Loop on hooks for post_deploy _raw_params={{ hook.type }}.yml] *** 2026-01-22 16:51:44,824 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:44 +0000 (0:00:00.093) 0:37:34.928 ****** 2026-01-22 16:51:44,824 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:44 +0000 (0:00:00.093) 0:37:34.926 ****** 2026-01-22 16:51:44,905 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:51:44,921 p=31411 u=zuul n=ansible | TASK [Run validations name=validations] **************************************** 2026-01-22 16:51:44,921 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:44 +0000 (0:00:00.097) 0:37:35.025 ****** 2026-01-22 16:51:44,921 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:44 +0000 (0:00:00.097) 0:37:35.024 ****** 2026-01-22 16:51:44,940 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:51:44,955 p=31411 u=zuul n=ansible | TASK [run_hook : Assert parameters are valid quiet=True, that=['_list_hooks is not string', '_list_hooks is not mapping', '_list_hooks is iterable', '(hooks | default([])) is not string', '(hooks | default([])) is not mapping', '(hooks | default([])) is iterable']] *** 2026-01-22 16:51:44,955 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:44 +0000 (0:00:00.033) 0:37:35.059 ****** 2026-01-22 16:51:44,955 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:44 +0000 (0:00:00.034) 0:37:35.058 ****** 2026-01-22 16:51:44,972 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:51:44,980 p=31411 u=zuul n=ansible | TASK [run_hook : Assert single hooks are all mappings quiet=True, 
that=['_not_mapping_hooks | length == 0'], msg=All single hooks must be a list of mappings or a mapping.] *** 2026-01-22 16:51:44,980 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:44 +0000 (0:00:00.024) 0:37:35.084 ****** 2026-01-22 16:51:44,980 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:44 +0000 (0:00:00.024) 0:37:35.083 ****** 2026-01-22 16:51:44,997 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:51:45,006 p=31411 u=zuul n=ansible | TASK [run_hook : Loop on hooks for pre_deploy _raw_params={{ hook.type }}.yml] *** 2026-01-22 16:51:45,006 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:45 +0000 (0:00:00.025) 0:37:35.110 ****** 2026-01-22 16:51:45,006 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:45 +0000 (0:00:00.025) 0:37:35.108 ****** 2026-01-22 16:51:45,082 p=31411 u=zuul n=ansible | skipping: [localhost] => (item={'name': '80 Kustomize OpenStack CR', 'type': 'playbook', 'source': 'control_plane_horizon.yml'}) 2026-01-22 16:51:45,083 p=31411 u=zuul n=ansible | skipping: [localhost] 2026-01-22 16:51:45,096 p=31411 u=zuul n=ansible | TASK [Early end if not architecture deploy _raw_params=end_play] *************** 2026-01-22 16:51:45,096 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:45 +0000 (0:00:00.090) 0:37:35.200 ****** 2026-01-22 16:51:45,096 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:45 +0000 (0:00:00.090) 0:37:35.199 ****** 2026-01-22 16:51:45,105 p=31411 u=zuul n=ansible | PLAY RECAP ********************************************************************* 2026-01-22 16:51:45,105 p=31411 u=zuul n=ansible | compute-0 : ok=0 changed=0 unreachable=0 failed=0 skipped=14 rescued=0 ignored=0 2026-01-22 16:51:45,105 p=31411 u=zuul n=ansible | localhost : ok=210 changed=74 unreachable=0 failed=0 skipped=153 rescued=0 ignored=1 2026-01-22 16:51:45,105 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:45 +0000 (0:00:00.008) 0:37:35.209 ****** 2026-01-22 16:51:45,105 
p=31411 u=zuul n=ansible | =============================================================================== 2026-01-22 16:51:45,106 p=31411 u=zuul n=ansible | edpm_deploy : Wait for OpenStackDataPlaneDeployment become Ready ----- 1394.64s 2026-01-22 16:51:45,106 p=31411 u=zuul n=ansible | edpm_prepare : Wait for OpenStack controlplane to be deployed --------- 339.96s 2026-01-22 16:51:45,106 p=31411 u=zuul n=ansible | install_yamls_makes : Run openstack ----------------------------------- 135.77s 2026-01-22 16:51:45,106 p=31411 u=zuul n=ansible | install_yamls_makes : Run openstack_init ------------------------------- 94.35s 2026-01-22 16:51:45,106 p=31411 u=zuul n=ansible | edpm_prepare : Wait for OpenStack subscription creation ---------------- 61.17s 2026-01-22 16:51:45,106 p=31411 u=zuul n=ansible | run_hook : Run hook without retry - Download needed tools -------------- 34.89s 2026-01-22 16:51:45,106 p=31411 u=zuul n=ansible | edpm_prepare : Wait for control plane to change its status ------------- 30.06s 2026-01-22 16:51:45,106 p=31411 u=zuul n=ansible | ci_setup : Install needed packages ------------------------------------- 28.23s 2026-01-22 16:51:45,106 p=31411 u=zuul n=ansible | cert_manager : Wait for cert-manager pods to be ready ------------------ 12.27s 2026-01-22 16:51:45,106 p=31411 u=zuul n=ansible | install_yamls_makes : Run edpm_deploy_prep ----------------------------- 10.68s 2026-01-22 16:51:45,106 p=31411 u=zuul n=ansible | run_hook : Run hook without retry - Fetch nodes facts and save them as parameters --- 9.93s 2026-01-22 16:51:45,106 p=31411 u=zuul n=ansible | repo_setup : Initialize python venv and install requirements ------------ 8.83s 2026-01-22 16:51:45,106 p=31411 u=zuul n=ansible | ci_local_storage : Perform action in the PV directory ------------------- 6.28s 2026-01-22 16:51:45,106 p=31411 u=zuul n=ansible | install_yamls_makes : Run netconfig_deploy ------------------------------ 6.13s 2026-01-22 16:51:45,106 p=31411 u=zuul 
n=ansible | ci_setup : Install openshift client ------------------------------------- 5.26s
2026-01-22 16:51:45,106 p=31411 u=zuul n=ansible | cert_manager : Install cert-manager from release manifest --------------- 3.52s
2026-01-22 16:51:45,106 p=31411 u=zuul n=ansible | ci_local_storage : Fetch hostnames for all hosts ------------------------ 2.83s
2026-01-22 16:51:45,106 p=31411 u=zuul n=ansible | edpm_deploy : Run nova-manage discover_hosts to ensure compute nodes are mapped --- 2.78s
2026-01-22 16:51:45,106 p=31411 u=zuul n=ansible | run_hook : Run hook without retry - Tune rabbitmq resources ------------- 2.65s
2026-01-22 16:51:45,106 p=31411 u=zuul n=ansible | openshift_setup : Create required namespaces ---------------------------- 1.76s
2026-01-22 16:51:45,106 p=31411 u=zuul n=ansible | Thursday 22 January 2026 16:51:45 +0000 (0:00:00.009) 0:37:35.209 ******
2026-01-22 16:51:45,106 p=31411 u=zuul n=ansible | ===============================================================================
2026-01-22 16:51:45,106 p=31411 u=zuul n=ansible | edpm_deploy ---------------------------------------------------------- 1401.18s
2026-01-22 16:51:45,106 p=31411 u=zuul n=ansible | edpm_prepare ---------------------------------------------------------- 435.51s
2026-01-22 16:51:45,106 p=31411 u=zuul n=ansible | install_yamls_makes --------------------------------------------------- 250.06s
2026-01-22 16:51:45,106 p=31411 u=zuul n=ansible | run_hook --------------------------------------------------------------- 57.41s
2026-01-22 16:51:45,106 p=31411 u=zuul n=ansible | ci_setup --------------------------------------------------------------- 35.82s
2026-01-22 16:51:45,106 p=31411 u=zuul n=ansible | cert_manager ----------------------------------------------------------- 19.98s
2026-01-22 16:51:45,106 p=31411 u=zuul n=ansible | repo_setup ------------------------------------------------------------- 17.27s
2026-01-22 16:51:45,106 p=31411 u=zuul n=ansible | ci_local_storage ------------------------------------------------------- 13.42s
2026-01-22 16:51:45,106 p=31411 u=zuul n=ansible | openshift_setup --------------------------------------------------------- 4.97s
2026-01-22 16:51:45,107 p=31411 u=zuul n=ansible | install_ca -------------------------------------------------------------- 4.19s
2026-01-22 16:51:45,107 p=31411 u=zuul n=ansible | openshift_login --------------------------------------------------------- 3.74s
2026-01-22 16:51:45,107 p=31411 u=zuul n=ansible | install_yamls ----------------------------------------------------------- 3.17s
2026-01-22 16:51:45,107 p=31411 u=zuul n=ansible | cifmw_setup ------------------------------------------------------------- 2.10s
2026-01-22 16:51:45,107 p=31411 u=zuul n=ansible | operator_build ---------------------------------------------------------- 1.35s
2026-01-22 16:51:45,107 p=31411 u=zuul n=ansible | install_openstack_ca ---------------------------------------------------- 1.17s
2026-01-22 16:51:45,107 p=31411 u=zuul n=ansible | gather_facts ------------------------------------------------------------ 1.03s
2026-01-22 16:51:45,107 p=31411 u=zuul n=ansible | edpm_deploy_baremetal --------------------------------------------------- 0.69s
2026-01-22 16:51:45,107 p=31411 u=zuul n=ansible | cifmw_nfs --------------------------------------------------------------- 0.46s
2026-01-22 16:51:45,107 p=31411 u=zuul n=ansible | discover_latest_image --------------------------------------------------- 0.40s
2026-01-22 16:51:45,107 p=31411 u=zuul n=ansible | ansible.builtin.file ---------------------------------------------------- 0.34s
2026-01-22 16:51:45,107 p=31411 u=zuul n=ansible | networking_mapper ------------------------------------------------------- 0.31s
2026-01-22 16:51:45,107 p=31411 u=zuul n=ansible | libvirt_manager --------------------------------------------------------- 0.29s
2026-01-22 16:51:45,107 p=31411 u=zuul n=ansible | pkg_build --------------------------------------------------------------- 0.10s
2026-01-22 16:51:45,107 p=31411 u=zuul n=ansible | ansible.builtin.meta ---------------------------------------------------- 0.08s
2026-01-22 16:51:45,107 p=31411 u=zuul n=ansible | ansible.builtin.include_tasks ------------------------------------------- 0.06s
2026-01-22 16:51:45,107 p=31411 u=zuul n=ansible | ansible.builtin.include_vars -------------------------------------------- 0.05s
2026-01-22 16:51:45,107 p=31411 u=zuul n=ansible | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
2026-01-22 16:51:45,107 p=31411 u=zuul n=ansible | total ---------------------------------------------------------------- 2255.17s
2026-01-22 16:51:46,510 p=35978 u=zuul n=ansible | PLAY [Run Post-deployment admin setup steps, test, and compliance scan] ********
2026-01-22 16:51:46,549 p=35978 u=zuul n=ansible | TASK [run_hook : Assert parameters are valid quiet=True, that=['_list_hooks is not string', '_list_hooks is not mapping', '_list_hooks is iterable', '(hooks | default([])) is not string', '(hooks | default([])) is not mapping', '(hooks | default([])) is iterable']] ***
2026-01-22 16:51:46,549 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:51:46 +0000 (0:00:00.045) 0:00:00.045 ******
2026-01-22 16:51:46,550 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:51:46 +0000 (0:00:00.044) 0:00:00.044 ******
2026-01-22 16:51:46,607 p=35978 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:51:46,619 p=35978 u=zuul n=ansible | TASK [run_hook : Assert single hooks are all mappings quiet=True, that=['_not_mapping_hooks | length == 0'], msg=All single hooks must be a list of mappings or a mapping.]
***
2026-01-22 16:51:46,620 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:51:46 +0000 (0:00:00.070) 0:00:00.115 ******
2026-01-22 16:51:46,620 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:51:46 +0000 (0:00:00.070) 0:00:00.114 ******
2026-01-22 16:51:46,694 p=35978 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:51:46,706 p=35978 u=zuul n=ansible | TASK [run_hook : Loop on hooks for pre_admin_setup _raw_params={{ hook.type }}.yml] ***
2026-01-22 16:51:46,706 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:51:46 +0000 (0:00:00.086) 0:00:00.201 ******
2026-01-22 16:51:46,706 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:51:46 +0000 (0:00:00.086) 0:00:00.200 ******
2026-01-22 16:51:46,778 p=35978 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:51:46,800 p=35978 u=zuul n=ansible | TASK [cifmw_setup : Load parameters files dir={{ cifmw_basedir }}/artifacts/parameters] ***
2026-01-22 16:51:46,800 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:51:46 +0000 (0:00:00.094) 0:00:00.296 ******
2026-01-22 16:51:46,801 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:51:46 +0000 (0:00:00.094) 0:00:00.295 ******
2026-01-22 16:51:46,837 p=35978 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:51:46,851 p=35978 u=zuul n=ansible | TASK [os_net_setup : Delete existing subnets _raw_params=set -euxo pipefail if [ $(oc exec -n {{ cifmw_os_net_setup_namespace }} openstackclient -- openstack subnet list --network {{ item.0.name }} -c Name -f value | grep -c {{ item.1.name }}) != 0 ];then oc exec -n {{ cifmw_os_net_setup_namespace }} openstackclient -- openstack subnet delete {{ item.1.name }} fi ] ***
2026-01-22 16:51:46,851 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:51:46 +0000 (0:00:00.050) 0:00:00.346 ******
2026-01-22 16:51:46,851 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:51:46 +0000 (0:00:00.050) 0:00:00.346 ******
2026-01-22 16:51:50,450 p=35978 u=zuul n=ansible | changed: [localhost] => (item=[{'name': 'public', 'external': True, 'shared': False, 'is_default': True, 'provider_network_type': 'flat', 'provider_physical_network': 'datacentre', 'availability_zone_hints': [], 'subnets': [{'name': 'public_subnet', 'cidr': '192.168.122.0/24', 'allocation_pool_start': '192.168.122.171', 'allocation_pool_end': '192.168.122.250', 'gateway_ip': '192.168.122.1', 'enable_dhcp': True}]}, {'name': 'public_subnet', 'cidr': '192.168.122.0/24', 'allocation_pool_start': '192.168.122.171', 'allocation_pool_end': '192.168.122.250', 'gateway_ip': '192.168.122.1', 'enable_dhcp': True}])
2026-01-22 16:51:50,475 p=35978 u=zuul n=ansible | TASK [os_net_setup : Delete existing subnet pools _raw_params=set -euxo pipefail if [ $(oc exec -n {{ cifmw_os_net_setup_namespace }} openstackclient -- openstack subnet pool list -c Name -f value | grep -c {{ item.name }}) != 0 ];then oc exec -n {{ cifmw_os_net_setup_namespace }} openstackclient -- openstack subnet pool delete {{ item.name }} fi ] ***
2026-01-22 16:51:50,475 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:51:50 +0000 (0:00:03.624) 0:00:03.971 ******
2026-01-22 16:51:50,476 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:51:50 +0000 (0:00:03.624) 0:00:03.970 ******
2026-01-22 16:51:53,354 p=35978 u=zuul n=ansible | changed: [localhost] => (item={'name': 'shared-pool-ipv4', 'default_prefix_length': 26, 'prefixes': '10.1.0.0/20', 'is_default': True, 'is_shared': True})
2026-01-22 16:51:56,100 p=35978 u=zuul n=ansible | changed: [localhost] => (item={'name': 'shared-pool-ipv6', 'default_prefix_length': 64, 'prefixes': 'fdfe:381f:8400::/56', 'is_default': True, 'is_shared': True})
2026-01-22 16:51:56,112 p=35978 u=zuul n=ansible | TASK [os_net_setup : Delete existing networks _raw_params=set -euxo pipefail if [ $(oc exec -n {{ cifmw_os_net_setup_namespace }} openstackclient -- openstack network list -c Name -f value | grep -c {{ item.name }}) != 0 ];then oc exec -n {{ cifmw_os_net_setup_namespace }} openstackclient -- openstack network delete {{ item.name }} fi ] ***
2026-01-22 16:51:56,112 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:51:56 +0000 (0:00:05.636) 0:00:09.607 ******
2026-01-22 16:51:56,112 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:51:56 +0000 (0:00:05.636) 0:00:09.607 ******
2026-01-22 16:51:59,350 p=35978 u=zuul n=ansible | changed: [localhost] => (item={'name': 'public', 'external': True, 'shared': False, 'is_default': True, 'provider_network_type': 'flat', 'provider_physical_network': 'datacentre', 'availability_zone_hints': [], 'subnets': [{'name': 'public_subnet', 'cidr': '192.168.122.0/24', 'allocation_pool_start': '192.168.122.171', 'allocation_pool_end': '192.168.122.250', 'gateway_ip': '192.168.122.1', 'enable_dhcp': True}]})
2026-01-22 16:51:59,365 p=35978 u=zuul n=ansible | TASK [os_net_setup : Print network creation commands msg={{ lookup('ansible.builtin.template', _template_file) }}] ***
2026-01-22 16:51:59,365 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:51:59 +0000 (0:00:03.253) 0:00:12.860 ******
2026-01-22 16:51:59,365 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:51:59 +0000 (0:00:03.253) 0:00:12.860 ******
2026-01-22 16:51:59,426 p=35978 u=zuul n=ansible | ok: [localhost] => msg: |
  set -euo pipefail
  oc exec -n openstack openstackclient -- openstack network create \
    --external \
    --default \
    --provider-network-type flat \
    --provider-physical-network datacentre \
    --no-share \
    public
2026-01-22 16:51:59,439 p=35978 u=zuul n=ansible | TASK [os_net_setup : Create networks _raw_params={{ lookup('ansible.builtin.template', _template_file) }} ] ***
2026-01-22 16:51:59,439 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:51:59 +0000 (0:00:00.073) 0:00:12.934 ******
2026-01-22 16:51:59,439 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:51:59 +0000 (0:00:00.073) 0:00:12.933 ******
2026-01-22 16:52:06,517 p=35978 u=zuul n=ansible | changed: [localhost]
2026-01-22 16:52:06,531 p=35978 u=zuul n=ansible | TASK [os_net_setup : Print subnet command creation msg={{ lookup('ansible.builtin.template', _template_file) }}] ***
2026-01-22 16:52:06,531 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:06 +0000 (0:00:07.091) 0:00:20.026 ******
2026-01-22 16:52:06,531 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:06 +0000 (0:00:07.091) 0:00:20.025 ******
2026-01-22 16:52:06,652 p=35978 u=zuul n=ansible | ok: [localhost] => msg: |
  set -euo pipefail
  oc exec -n openstack openstackclient -- openstack subnet create \
    --allocation-pool start=192.168.122.171,end=192.168.122.250 \
    --subnet-range 192.168.122.0/24 \
    --gateway 192.168.122.1 \
    --network public \
    public_subnet
2026-01-22 16:52:06,662 p=35978 u=zuul n=ansible | TASK [os_net_setup : Create subnets _raw_params={{ lookup('ansible.builtin.template', _template_file) }} ] ***
2026-01-22 16:52:06,662 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:06 +0000 (0:00:00.131) 0:00:20.157 ******
2026-01-22 16:52:06,662 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:06 +0000 (0:00:00.131) 0:00:20.157 ******
2026-01-22 16:52:15,664 p=35978 u=zuul n=ansible | changed: [localhost]
2026-01-22 16:52:15,691 p=35978 u=zuul n=ansible | TASK [os_net_setup : Print subnet pools command creation msg={{ lookup('ansible.builtin.template', _template_file) }}] ***
2026-01-22 16:52:15,692 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:15 +0000 (0:00:09.029) 0:00:29.187 ******
2026-01-22 16:52:15,692 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:15 +0000 (0:00:09.029) 0:00:29.186 ******
2026-01-22 16:52:15,766 p=35978 u=zuul n=ansible | ok: [localhost] => msg: |
  set -euo pipefail
  oc exec -n openstack openstackclient -- openstack subnet pool create \
    --default-prefix-length 26 \
    --pool-prefix 10.1.0.0/20 \
    --default \
    --share \
    shared-pool-ipv4
  oc exec -n openstack openstackclient -- openstack subnet pool create \
    --default-prefix-length 64 \
    --pool-prefix fdfe:381f:8400::/56 \
    --default \
    --share \
    shared-pool-ipv6
2026-01-22 16:52:15,792 p=35978 u=zuul n=ansible | TASK [os_net_setup : Create subnet pools _raw_params={{ lookup('ansible.builtin.template', _template_file) }} ] ***
2026-01-22 16:52:15,792 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:15 +0000 (0:00:00.100) 0:00:29.288 ******
2026-01-22 16:52:15,793 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:15 +0000 (0:00:00.100) 0:00:29.287 ******
2026-01-22 16:52:21,506 p=35978 u=zuul n=ansible | changed: [localhost]
2026-01-22 16:52:21,538 p=35978 u=zuul n=ansible | TASK [run_hook : Assert parameters are valid quiet=True, that=['_list_hooks is not string', '_list_hooks is not mapping', '_list_hooks is iterable', '(hooks | default([])) is not string', '(hooks | default([])) is not mapping', '(hooks | default([])) is iterable']] ***
2026-01-22 16:52:21,539 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:21 +0000 (0:00:05.746) 0:00:35.034 ******
2026-01-22 16:52:21,539 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:21 +0000 (0:00:05.746) 0:00:35.033 ******
2026-01-22 16:52:21,606 p=35978 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:52:21,621 p=35978 u=zuul n=ansible | TASK [run_hook : Assert single hooks are all mappings quiet=True, that=['_not_mapping_hooks | length == 0'], msg=All single hooks must be a list of mappings or a mapping.]
***
2026-01-22 16:52:21,621 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:21 +0000 (0:00:00.082) 0:00:35.116 ******
2026-01-22 16:52:21,621 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:21 +0000 (0:00:00.082) 0:00:35.116 ******
2026-01-22 16:52:21,692 p=35978 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:52:21,706 p=35978 u=zuul n=ansible | TASK [run_hook : Loop on hooks for post_admin_setup _raw_params={{ hook.type }}.yml] ***
2026-01-22 16:52:21,706 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:21 +0000 (0:00:00.085) 0:00:35.201 ******
2026-01-22 16:52:21,706 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:21 +0000 (0:00:00.085) 0:00:35.201 ******
2026-01-22 16:52:21,802 p=35978 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:52:21,832 p=35978 u=zuul n=ansible | TASK [Validate required variables are set that=['cifmw_fdp_update_target_package is defined', 'cifmw_fdp_update_target_package | length > 0', 'cifmw_fdp_update_repo_baseurl is defined', 'cifmw_fdp_update_repo_baseurl | length > 0'], fail_msg=Required variables are missing! You must set: - cifmw_fdp_update_target_package: Name of the RPM package to update - cifmw_fdp_update_repo_baseurl: Repository base URL containing the updated package , success_msg=Required variables validated successfully] ***
2026-01-22 16:52:21,832 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:21 +0000 (0:00:00.125) 0:00:35.327 ******
2026-01-22 16:52:21,832 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:21 +0000 (0:00:00.125) 0:00:35.327 ******
2026-01-22 16:52:21,846 p=35978 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:52:21,858 p=35978 u=zuul n=ansible | TASK [fdp_update_container_images : Validate parameters and initialize _raw_params=validate.yml] ***
2026-01-22 16:52:21,858 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:21 +0000 (0:00:00.025) 0:00:35.353 ******
2026-01-22 16:52:21,858 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:21 +0000 (0:00:00.025) 0:00:35.353 ******
2026-01-22 16:52:21,872 p=35978 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:52:21,906 p=35978 u=zuul n=ansible | TASK [fdp_update_container_images : Detect OpenShift registry URL _raw_params=detect_registry.yml] ***
2026-01-22 16:52:21,906 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:21 +0000 (0:00:00.048) 0:00:35.401 ******
2026-01-22 16:52:21,906 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:21 +0000 (0:00:00.048) 0:00:35.401 ******
2026-01-22 16:52:21,920 p=35978 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:52:21,930 p=35978 u=zuul n=ansible | TASK [fdp_update_container_images : Configure registry CA certificate _raw_params=configure_ca_cert.yml] ***
2026-01-22 16:52:21,930 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:21 +0000 (0:00:00.024) 0:00:35.426 ******
2026-01-22 16:52:21,930 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:21 +0000 (0:00:00.024) 0:00:35.425 ******
2026-01-22 16:52:21,944 p=35978 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:52:21,954 p=35978 u=zuul n=ansible | TASK [fdp_update_container_images : Authenticate with registry _raw_params=authenticate_registry.yml] ***
2026-01-22 16:52:21,954 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:21 +0000 (0:00:00.023) 0:00:35.449 ******
2026-01-22 16:52:21,954 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:21 +0000 (0:00:00.023) 0:00:35.449 ******
2026-01-22 16:52:21,967 p=35978 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:52:21,978 p=35978 u=zuul n=ansible | TASK [fdp_update_container_images : Fetch images to process _raw_params=fetch_images.yml] ***
2026-01-22 16:52:21,978 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:21 +0000 (0:00:00.023) 0:00:35.473 ******
2026-01-22 16:52:21,978 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:21 +0000 (0:00:00.023) 0:00:35.473 ******
2026-01-22 16:52:21,992 p=35978 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:52:22,002 p=35978 u=zuul n=ansible | TASK [fdp_update_container_images : Build and push updated images _raw_params=process_image.yml] ***
2026-01-22 16:52:22,002 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:22 +0000 (0:00:00.023) 0:00:35.497 ******
2026-01-22 16:52:22,002 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:22 +0000 (0:00:00.023) 0:00:35.497 ******
2026-01-22 16:52:22,016 p=35978 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:52:22,026 p=35978 u=zuul n=ansible | TASK [fdp_update_container_images : Display summary msg=['==========================================', 'Container image update complete', 'Target package: {{ cifmw_fdp_update_container_images_target_package }}', 'Images processed: {{ _cifmw_fdp_update_container_images_processed_images }}', "Updated: {{ _cifmw_fdp_update_container_images_updated_cr_keys | join(', ') if _cifmw_fdp_update_container_images_updated_cr_keys | length > 0 else 'None' }}", '==========================================']] ***
2026-01-22 16:52:22,026 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:22 +0000 (0:00:00.023) 0:00:35.521 ******
2026-01-22 16:52:22,026 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:22 +0000 (0:00:00.023) 0:00:35.521 ******
2026-01-22 16:52:22,047 p=35978 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:52:22,058 p=35978 u=zuul n=ansible | TASK [fdp_update_container_images : Cleanup temporary directory path={{ _cifmw_fdp_update_container_images_temp_dir }}, state=absent] ***
2026-01-22 16:52:22,058 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:22 +0000 (0:00:00.032) 0:00:35.554 ******
2026-01-22 16:52:22,058 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:22 +0000 (0:00:00.032) 0:00:35.553 ******
2026-01-22 16:52:22,076 p=35978 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:52:22,095 p=35978 u=zuul n=ansible | TASK [fdp_update_edpm : Validate parameters and initialize _raw_params=validate.yml] ***
2026-01-22 16:52:22,095 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:22 +0000 (0:00:00.036) 0:00:35.590 ******
2026-01-22 16:52:22,095 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:22 +0000 (0:00:00.036) 0:00:35.590 ******
2026-01-22 16:52:22,112 p=35978 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:52:22,124 p=35978 u=zuul n=ansible | TASK [fdp_update_edpm : Setup hypervisor firewall for registry access _raw_params=setup_hypervisor_firewall.yml] ***
2026-01-22 16:52:22,125 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:22 +0000 (0:00:00.029) 0:00:35.620 ******
2026-01-22 16:52:22,125 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:22 +0000 (0:00:00.029) 0:00:35.619 ******
2026-01-22 16:52:22,144 p=35978 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:52:22,157 p=35978 u=zuul n=ansible | TASK [fdp_update_edpm : Fetch EDPM NodeSets _raw_params=fetch_nodesets.yml] ****
2026-01-22 16:52:22,157 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:22 +0000 (0:00:00.032) 0:00:35.652 ******
2026-01-22 16:52:22,157 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:22 +0000 (0:00:00.032) 0:00:35.652 ******
2026-01-22 16:52:22,181 p=35978 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:52:22,192 p=35978 u=zuul n=ansible | TASK [fdp_update_edpm : Update container images _raw_params=update_container_images.yml] ***
2026-01-22 16:52:22,192 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:22 +0000 (0:00:00.035) 0:00:35.687 ******
2026-01-22 16:52:22,192 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:22 +0000 (0:00:00.035) 0:00:35.687 ******
2026-01-22 16:52:22,206 p=35978 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:52:22,218 p=35978 u=zuul n=ansible | TASK [fdp_update_edpm : Process each NodeSet _raw_params=process_nodeset.yml] ***
2026-01-22 16:52:22,218 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:22 +0000 (0:00:00.026) 0:00:35.713 ******
2026-01-22 16:52:22,218 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:22 +0000 (0:00:00.026) 0:00:35.713 ******
2026-01-22 16:52:22,228 p=35978 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:52:22,238 p=35978 u=zuul n=ansible | TASK [fdp_update_edpm : Deploy updates to EDPM nodes _raw_params=create_deployment.yml] ***
2026-01-22 16:52:22,238 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:22 +0000 (0:00:00.020) 0:00:35.734 ******
2026-01-22 16:52:22,238 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:22 +0000 (0:00:00.020) 0:00:35.733 ******
2026-01-22 16:52:22,252 p=35978 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:52:22,263 p=35978 u=zuul n=ansible | TASK [fdp_update_edpm : Display update summary msg=['EDPM Update Summary', 'Updated {{ _cifmw_fdp_update_edpm_updated_nodesets | length }} NodeSet(s): {{ _cifmw_fdp_update_edpm_updated_nodesets }}', 'Container images updated: {{ cifmw_fdp_update_edpm_containers_enabled }}', 'Host packages updated: {{ cifmw_fdp_update_edpm_packages_enabled }}']] ***
2026-01-22 16:52:22,263 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:22 +0000 (0:00:00.024) 0:00:35.758 ******
2026-01-22 16:52:22,263 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:22 +0000 (0:00:00.024) 0:00:35.757 ******
2026-01-22 16:52:22,277 p=35978 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:52:22,296 p=35978 u=zuul n=ansible | TASK [run_hook : Assert parameters are valid quiet=True, that=['_list_hooks is not string', '_list_hooks is not mapping', '_list_hooks is iterable', '(hooks | default([])) is not string', '(hooks | default([])) is not mapping', '(hooks | default([])) is iterable']] ***
2026-01-22 16:52:22,297 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:22 +0000 (0:00:00.033) 0:00:35.792 ******
2026-01-22 16:52:22,297 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:22 +0000 (0:00:00.033) 0:00:35.791 ******
2026-01-22 16:52:22,352 p=35978 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:52:22,362 p=35978 u=zuul n=ansible | TASK [run_hook : Assert single hooks are all mappings quiet=True, that=['_not_mapping_hooks | length == 0'], msg=All single hooks must be a list of mappings or a mapping.]
***
2026-01-22 16:52:22,363 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:22 +0000 (0:00:00.066) 0:00:35.858 ******
2026-01-22 16:52:22,363 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:22 +0000 (0:00:00.066) 0:00:35.857 ******
2026-01-22 16:52:22,438 p=35978 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:52:22,450 p=35978 u=zuul n=ansible | TASK [run_hook : Loop on hooks for pre_tests _raw_params={{ hook.type }}.yml] ***
2026-01-22 16:52:22,450 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:22 +0000 (0:00:00.087) 0:00:35.945 ******
2026-01-22 16:52:22,450 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:22 +0000 (0:00:00.087) 0:00:35.945 ******
2026-01-22 16:52:22,524 p=35978 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:52:22,548 p=35978 u=zuul n=ansible | TASK [tempest : Ensure podman is installed name=podman, state=present] *********
2026-01-22 16:52:22,548 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:22 +0000 (0:00:00.097) 0:00:36.043 ******
2026-01-22 16:52:22,548 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:22 +0000 (0:00:00.097) 0:00:36.043 ******
2026-01-22 16:52:23,916 p=35978 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:52:23,929 p=35978 u=zuul n=ansible | TASK [tempest : Create tempest directories path={{ cifmw_tempest_artifacts_basedir }}, state=directory, mode=0755] ***
2026-01-22 16:52:23,929 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:23 +0000 (0:00:01.381) 0:00:37.424 ******
2026-01-22 16:52:23,929 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:23 +0000 (0:00:01.381) 0:00:37.424 ******
2026-01-22 16:52:24,194 p=35978 u=zuul n=ansible | changed: [localhost]
2026-01-22 16:52:24,204 p=35978 u=zuul n=ansible | TASK [tempest : Setup tempest tests _raw_params=tempest-tests.yml] *************
2026-01-22 16:52:24,204 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:24 +0000 (0:00:00.274) 0:00:37.699 ******
2026-01-22 16:52:24,204 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:24 +0000 (0:00:00.274) 0:00:37.698 ******
2026-01-22 16:52:24,235 p=35978 u=zuul n=ansible | included: /home/zuul/src/github.com/openstack-k8s-operators/ci-framework/roles/tempest/tasks/tempest-tests.yml for localhost
2026-01-22 16:52:24,251 p=35978 u=zuul n=ansible | TASK [tempest : Copy list_allowed to artifacts dir mode=0644, dest={{ cifmw_tempest_artifacts_basedir }}/list_allowed.yml, src=list_allowed.yml] ***
2026-01-22 16:52:24,251 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:24 +0000 (0:00:00.047) 0:00:37.746 ******
2026-01-22 16:52:24,251 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:24 +0000 (0:00:00.047) 0:00:37.745 ******
2026-01-22 16:52:24,271 p=35978 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:52:24,280 p=35978 u=zuul n=ansible | TASK [tempest : Get list of tests to be executed yaml_file={{ cifmw_tempest_artifacts_basedir }}/list_allowed.yml, groups={{ cifmw_tempest_default_groups }}, job={{ cifmw_tempest_job_name | default(omit) }}] ***
2026-01-22 16:52:24,280 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:24 +0000 (0:00:00.029) 0:00:37.775 ******
2026-01-22 16:52:24,280 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:24 +0000 (0:00:00.029) 0:00:37.775 ******
2026-01-22 16:52:24,300 p=35978 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:52:24,311 p=35978 u=zuul n=ansible | TASK [tempest : Creating include.txt mode=0644, dest={{ cifmw_tempest_artifacts_basedir }}/include.txt, content={% for test in list_allowed.allowed_tests %}{{ test }} {% endfor %}] ***
2026-01-22 16:52:24,311 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:24 +0000 (0:00:00.030) 0:00:37.806 ******
2026-01-22 16:52:24,311 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:24 +0000 (0:00:00.030) 0:00:37.805 ******
2026-01-22 16:52:24,330 p=35978 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:52:24,340 p=35978 u=zuul n=ansible | TASK [tempest : Show tests to be executed msg={{ list_allowed }}] **************
2026-01-22 16:52:24,340 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:24 +0000 (0:00:00.029) 0:00:37.835 ******
2026-01-22 16:52:24,340 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:24 +0000 (0:00:00.029) 0:00:37.835 ******
2026-01-22 16:52:24,360 p=35978 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:52:24,372 p=35978 u=zuul n=ansible | TASK [tempest : Copy list_skipped to artifacts dir mode=0644, dest={{ cifmw_tempest_artifacts_basedir }}/list_skipped.yml, src=list_skipped.yml] ***
2026-01-22 16:52:24,372 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:24 +0000 (0:00:00.032) 0:00:37.867 ******
2026-01-22 16:52:24,372 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:24 +0000 (0:00:00.032) 0:00:37.867 ******
2026-01-22 16:52:24,393 p=35978 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:52:24,411 p=35978 u=zuul n=ansible | TASK [tempest : Get list of tests to be excluded yaml_file={{ cifmw_tempest_artifacts_basedir }}/list_skipped.yml, jobs={{ cifmw_tempest_default_jobs }}] ***
2026-01-22 16:52:24,411 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:24 +0000 (0:00:00.038) 0:00:37.906 ******
2026-01-22 16:52:24,411 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:24 +0000 (0:00:00.038) 0:00:37.906 ******
2026-01-22 16:52:24,436 p=35978 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:52:24,447 p=35978 u=zuul n=ansible | TASK [tempest : Creating exclude.txt mode=0644, dest={{ cifmw_tempest_artifacts_basedir }}/exclude.txt, content={% for test in list_skipped.skipped_tests %}{{ test }} {% endfor %}] ***
2026-01-22 16:52:24,447 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:24 +0000 (0:00:00.035) 0:00:37.942 ******
2026-01-22 16:52:24,447 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:24 +0000 (0:00:00.035) 0:00:37.941 ******
2026-01-22 16:52:24,473 p=35978 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:52:24,483 p=35978 u=zuul n=ansible | TASK [tempest : Show tests to be excluded msg={{ list_skipped.skipped_tests }}] ***
2026-01-22 16:52:24,483 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:24 +0000 (0:00:00.036) 0:00:37.978 ******
2026-01-22 16:52:24,483 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:24 +0000 (0:00:00.036) 0:00:37.978 ******
2026-01-22 16:52:24,506 p=35978 u=zuul n=ansible | skipping: [localhost]
2026-01-22 16:52:24,519 p=35978 u=zuul n=ansible | TASK [tempest : Creating include.txt mode=0644, dest={{ cifmw_tempest_artifacts_basedir }}/include.txt, content={% for test in cifmw_tempest_tests_allowed %}{{ test }} {% endfor %}] ***
2026-01-22 16:52:24,519 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:24 +0000 (0:00:00.036) 0:00:38.015 ******
2026-01-22 16:52:24,519 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:24 +0000 (0:00:00.036) 0:00:38.014 ******
2026-01-22 16:52:25,159 p=35978 u=zuul n=ansible | changed: [localhost]
2026-01-22 16:52:25,172 p=35978 u=zuul n=ansible | TASK [tempest : Show tests to be executed msg={{ cifmw_tempest_tests_allowed }}] ***
2026-01-22 16:52:25,172 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:25 +0000 (0:00:00.652) 0:00:38.667 ******
2026-01-22 16:52:25,172 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:25 +0000 (0:00:00.652) 0:00:38.667 ******
2026-01-22 16:52:25,200 p=35978 u=zuul n=ansible | ok: [localhost] => msg:
  - neutron_tempest_plugin.api
  - neutron_tempest_plugin.scenario
  - tempest.scenario.test_server_basic_ops.TestServerBasicOps.test_server_basic_ops
2026-01-22 16:52:25,214 p=35978 u=zuul n=ansible | TASK [tempest : Creating exclude.txt mode=0644, dest={{ cifmw_tempest_artifacts_basedir }}/exclude.txt, content={% for test in cifmw_tempest_tests_skipped %}{{ test }} {% endfor %}] ***
2026-01-22 16:52:25,214 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:25 +0000 (0:00:00.042) 0:00:38.710 ******
2026-01-22 16:52:25,215 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:25 +0000 (0:00:00.042) 0:00:38.709 ******
2026-01-22 16:52:25,653 p=35978 u=zuul n=ansible | changed: [localhost]
2026-01-22 16:52:25,663 p=35978 u=zuul n=ansible | TASK [tempest : Show tests to be excluded msg={{ cifmw_tempest_tests_skipped }}] ***
2026-01-22 16:52:25,663 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:25 +0000 (0:00:00.448) 0:00:39.158 ******
2026-01-22 16:52:25,663 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:25 +0000 (0:00:00.448) 0:00:39.157 ******
2026-01-22 16:52:25,688 p=35978 u=zuul n=ansible | ok: [localhost] => msg:
  - neutron_tempest_plugin.scenario.test_mtu.NetworkWritableMtuTest
  - test_qos_dscp_create_and_update
  - NetworkSecGroupTest
2026-01-22 16:52:25,697 p=35978 u=zuul n=ansible | TASK [tempest : Create clouds.yaml _raw_params=create-clouds-file.yml] *********
2026-01-22 16:52:25,697 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:25 +0000 (0:00:00.034) 0:00:39.193 ******
2026-01-22 16:52:25,697 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:25 +0000 (0:00:00.034) 0:00:39.192 ******
2026-01-22 16:52:25,733 p=35978 u=zuul n=ansible | included: /home/zuul/src/github.com/openstack-k8s-operators/ci-framework/roles/tempest/tasks/create-clouds-file.yml for localhost
2026-01-22 16:52:25,755 p=35978 u=zuul n=ansible | TASK [tempest : Get keystone data _raw_params=oc get keystoneapi keystone -n {{ cifmw_openstack_namespace }} -o json] ***
2026-01-22 16:52:25,755 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:25 +0000 (0:00:00.057) 0:00:39.250 ******
2026-01-22 16:52:25,755 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:25 +0000 (0:00:00.057) 0:00:39.250 ******
2026-01-22 16:52:26,133 p=35978 u=zuul n=ansible | changed: [localhost]
2026-01-22 16:52:26,143 p=35978 u=zuul n=ansible | TASK [tempest : Set keystone vars keystone_secret_name={{ _keystone_json_content['spec']['secret'] }}, keystone_passwd_select={{ _keystone_json_content['spec']['passwordSelectors']['admin'] }}, keystone_api={{ _keystone_json_content['status']['apiEndpoints']['public'] }}] ***
2026-01-22 16:52:26,143 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:26 +0000 (0:00:00.387) 0:00:39.638 ******
2026-01-22 16:52:26,143 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:26 +0000 (0:00:00.387) 0:00:39.637 ******
2026-01-22 16:52:26,175 p=35978 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:52:26,186 p=35978 u=zuul n=ansible | TASK [tempest : Get credentials data _raw_params=oc get secret {{ keystone_secret_name }} -n {{ cifmw_openstack_namespace }} -o json] ***
2026-01-22 16:52:26,186 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:26 +0000 (0:00:00.042) 0:00:39.681 ******
2026-01-22 16:52:26,186 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:26 +0000 (0:00:00.042) 0:00:39.680 ******
2026-01-22 16:52:26,549 p=35978 u=zuul n=ansible | changed: [localhost]
2026-01-22 16:52:26,559 p=35978 u=zuul n=ansible | TASK [tempest : Get password data os_password={{ _os_password_data_json_content['data'][keystone_passwd_select] | ansible.builtin.b64decode }}] ***
2026-01-22 16:52:26,559 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:26 +0000 (0:00:00.373) 0:00:40.054 ******
2026-01-22 16:52:26,559 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:26 +0000 (0:00:00.373) 0:00:40.054 ******
2026-01-22 16:52:26,580 p=35978 u=zuul n=ansible | ok: [localhost]
2026-01-22 16:52:26,590 p=35978 u=zuul n=ansible | TASK [tempest : Get clouds.yaml src=clouds.yaml.j2, dest={{ cifmw_tempest_artifacts_basedir }}/clouds.yaml, mode=0644] ***
2026-01-22 16:52:26,590 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:26 +0000 (0:00:00.031) 0:00:40.085 ******
2026-01-22 16:52:26,590 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:26 +0000 (0:00:00.031) 0:00:40.085 ******
2026-01-22 16:52:27,023 p=35978 u=zuul n=ansible | changed: [localhost]
2026-01-22 16:52:27,036 p=35978 u=zuul n=ansible | TASK [tempest : Configure tempest _raw_params=configure-tempest.yml] ***********
2026-01-22 16:52:27,036 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:27 +0000 (0:00:00.446) 0:00:40.532 ******
2026-01-22 16:52:27,036 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:27 +0000 (0:00:00.446) 0:00:40.531 ******
2026-01-22 16:52:27,069 p=35978 u=zuul n=ansible | included: /home/zuul/src/github.com/openstack-k8s-operators/ci-framework/roles/tempest/tasks/configure-tempest.yml for localhost
2026-01-22 16:52:27,092 p=35978 u=zuul n=ansible | TASK [tempest : Create profile.yaml file content={{ _cifmw_tempest_tempestconf_profile_content | to_nice_yaml }}, dest={{ cifmw_tempest_artifacts_basedir }}/profile.yaml, mode=0644] ***
2026-01-22 16:52:27,092 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:27 +0000 (0:00:00.055) 0:00:40.587 ******
2026-01-22 16:52:27,092 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:27 +0000 (0:00:00.055) 0:00:40.587 ******
2026-01-22 16:52:27,473 p=35978 u=zuul n=ansible | changed: [localhost]
2026-01-22 16:52:27,483 p=35978 u=zuul n=ansible | TASK [tempest : Copy CA bundle to cifmw_tempest_artifacts_basedir src=/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem, dest={{ cifmw_tempest_artifacts_basedir }}, mode=0444, owner={{ ansible_user | default(lookup('env', 'USER')) }}, group={{ ansible_user | default(lookup('env', 'USER')) }}, remote_src=True] ***
2026-01-22 16:52:27,483 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:27 +0000 (0:00:00.391) 0:00:40.979 ******
2026-01-22 16:52:27,483 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:27 +0000 (0:00:00.391) 0:00:40.978 ******
2026-01-22 16:52:27,701 p=35978 u=zuul n=ansible | changed: [localhost]
2026-01-22 16:52:27,711 p=35978 u=zuul n=ansible | TASK [tempest : Set proper permission for tempest directory _raw_params=podman unshare chown 42480:42480 -R {{ cifmw_tempest_artifacts_basedir }}] ***
2026-01-22 16:52:27,711 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:27 +0000 (0:00:00.227) 0:00:41.206 ******
2026-01-22 16:52:27,711 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:27 +0000 (0:00:00.227) 0:00:41.205 ******
2026-01-22 16:52:27,987 p=35978 u=zuul n=ansible | changed: [localhost]
2026-01-22 16:52:28,008 p=35978 u=zuul n=ansible | TASK [tempest : Ensure we have tempest container image name={{ cifmw_tempest_image }}:{{ cifmw_tempest_image_tag }}] ***
2026-01-22 16:52:28,008 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:28 +0000 (0:00:00.297) 0:00:41.503 ******
2026-01-22 16:52:28,008 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:52:28 +0000 (0:00:00.297) 0:00:41.502 ******
2026-01-22 16:53:16,565 p=35978 u=zuul n=ansible | changed: [localhost]
2026-01-22 16:53:16,576 p=35978 u=zuul n=ansible | TASK [tempest : Run tempest name=tempest, image={{ cifmw_tempest_image }}:{{ cifmw_tempest_image_tag }}, state=started, auto_remove={{ cifmw_tempest_remove_container }}, network=host, volume=['{{ cifmw_tempest_artifacts_basedir }}/:/var/lib/tempest/external_files:Z', '{{ cifmw_tempest_artifacts_basedir }}/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:Z'], detach=False, dns={{ cifmw_tempest_dns_servers }}, env={'CONCURRENCY': '{{ cifmw_tempest_concurrency | default(omit) }}'}] ***
2026-01-22 16:53:16,576 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:53:16 +0000 (0:00:48.567) 0:01:30.071 ******
2026-01-22 16:53:16,576 p=35978 u=zuul n=ansible | Thursday 22 January 2026 16:53:16 +0000 (0:00:48.567) 0:01:30.070 ******
2026-01-22 18:07:46,834 p=35978 u=zuul n=ansible | fatal: [localhost]: FAILED!
=> changed: false msg: Container tempest exited with code 1 when run stderr: "+ RETURN_VALUE=0\n+ HOMEDIR=/var/lib/tempest\n+ TEMPEST_PATH=/var/lib/tempest/\n+ TEMPEST_DIR=/var/lib/tempest/openshift\n+ CONCURRENCY=4\n+ TEMPESTCONF_ARGS=\n+ TEMPEST_ARGS=\n+ TEMPEST_DEBUG_MODE=false\n+ TEMPEST_CLEANUP=false\n+ RERUN_FAILED_TESTS=false\n+ RERUN_OVERRIDE_STATUS=false\n+ TEMPEST_EXPECTED_FAILURES_LIST=/dev/null\n+ '[' false == true ']'\n+ [[ -z '' ]]\n+ TEMPEST_WORKFLOW_STEP_DIR_NAME=tempest\n+ [[ ! -z true ]]\n+ TEMPEST_PATH=/var/lib/tempest/external_files/\n+ TEMPEST_LOGS_DIR=/var/lib/tempest/external_files//tempest/\n+ FAILED_TESTS_FILE=/var/lib/tempest/external_files//tempest//stestr_failing.txt\n+ [[ true == true ]]\n+ TEMPESTCONF_ARGS+='--create '\n+ [[ '' == true ]]\n+ [[ '' == true ]]\n+ [[ '' == true ]]\n+ [[ true == true ]]\n+ TEMPESTCONF_ARGS+='--debug '\n+ [[ '' == true ]]\n+ [[ '' == true ]]\n+ [[ '' == true ]]\n+ [[ '' == true ]]\n+ [[ '' == true ]]\n+ [[ ! -z '' ]]\n+ [[ ! -z '' ]]\n+ [[ ! -z '' ]]\n+ [[ ! -z '' ]]\n+ [[ ! -z '' ]]\n+ [[ ! -z '' ]]\n+ [[ ! -z '' ]]\n+ [[ ! -z '' ]]\n+ [[ ! -z '' ]]\n+ [[ ! -z '' ]]\n+ [[ ! -z '' ]]\n+ [[ ! -z '' ]]\n++ echo\n++ tr '\\n' ' '\n+ TEMPESTCONF_OVERRIDES=' identity.v3_endpoint_type public '\n+ TEMPESTCONF_OVERRIDES+='DEFAULT.log_dir /var/lib/tempest/external_files//tempest/ '\n+ [[ !
-z /usr/libexec/octavia-tempest-plugin-tests-httpd ]]\n+ TEMPESTCONF_OVERRIDES+='load_balancer.test_server_path /usr/libexec/octavia-tempest-plugin-tests-httpd '\n+ TEMPEST_EXTERNAL_PLUGIN_GIT_URL=\n+ TEMPEST_EXTERNAL_PLUGIN_CHANGE_URL=\n+ TEMPEST_EXTERNAL_PLUGIN_REFSPEC=\n+ TEMPEST_EXTERNAL_PLUGIN_DIR=/var/lib/tempest/external-plugins\n+ VENV_DIR=/var/lib/tempest/external-plugins/.venv\n+ TEMPEST_EXTRA_RPMS=\n+ TEMPEST_EXTRA_IMAGES_URL=\n+ TEMPEST_EXTRA_IMAGES_DISK_FORMAT=\n+ TEMPEST_EXTRA_IMAGES_OS_CLOUD=\n+ TEMPEST_EXTRA_IMAGES_ID=\n+ TEMPEST_EXTRA_IMAGES_NAME=\n+ TEMPEST_EXTRA_IMAGES_CONTAINER_FORMAT=\n+ TEMPEST_EXTRA_IMAGES_FLAVOR_ID=\n+ TEMPEST_EXTRA_IMAGES_FLAVOR_RAM=\n+ TEMPEST_EXTRA_IMAGES_FLAVOR_DISK=\n+ TEMPEST_EXTRA_IMAGES_FLAVOR_VCPUS=\n+ TEMPEST_EXTRA_IMAGES_FLAVOR_NAME=\n+ TEMPEST_EXTRA_IMAGES_FLAVOR_OS_CLOUD=\n+ TEMPEST_EXTRA_IMAGES_CREATE_TIMEOUT=\n+ OLD_IFS=' \t\n'\n+ IFS=,\n+ read -ra TEMPEST_EXTERNAL_PLUGIN_GIT_URL\n+ read -ra TEMPEST_EXTERNAL_PLUGIN_CHANGE_URL\n+ read -ra TEMPEST_EXTERNAL_PLUGIN_REFSPEC\n+ read -ra TEMPEST_EXTRA_RPMS\n+ read -ra TEMPEST_EXTRA_IMAGES_URL\n+ read -ra TEMPEST_EXTRA_IMAGES_DISK_FORMAT\n+ read -ra TEMPEST_EXTRA_IMAGES_OS_CLOUD\n+ read -ra TEMPEST_EXTRA_IMAGES_ID\n+ read -ra TEMPEST_EXTRA_IMAGES_NAME\n+ read -ra TEMPEST_EXTRA_IMAGES_CONTAINER_FORMAT\n+ read -ra TEMPEST_EXTRA_IMAGES_CREATE_TIMEOUT\n+ read -ra TEMPEST_EXTRA_IMAGES_FLAVOR_ID\n+ read -ra TEMPEST_EXTRA_IMAGES_FLAVOR_RAM\n+ read -ra TEMPEST_EXTRA_IMAGES_FLAVOR_DISK\n+ read -ra TEMPEST_EXTRA_IMAGES_FLAVOR_VCPUS\n+ read -ra TEMPEST_EXTRA_IMAGES_FLAVOR_NAME\n+ read -ra TEMPEST_EXTRA_IMAGES_FLAVOR_OS_CLOUD\n+ IFS=' \t\n'\n+ [[ '' == true ]]\n+ [[ '' == true ]]\n+ [[ ! -z '' ]]\n+ [[ ! -z '' ]]\n+ [[ ! -z '' ]]\n+ [[ ! -z '' ]]\n+ [[ -z '' ]]\n+ TEMPEST_ARGS+='--include-list /var/lib/tempest/external_files//include.txt '\n+ [[ -z '' ]]\n+ TEMPEST_ARGS+='--exclude-list /var/lib/tempest/external_files//exclude.txt '\n+ [[ ! -z '' ]]\n+ [[ ! 
-z '' ]]\n+ '[' -n 4 ']'\n+ '[' -z '' ']'\n+ TEMPEST_ARGS+='--concurrency 4 '\n+ export OS_CLOUD=default\n+ OS_CLOUD=default\n+ '[' '!' -z true ']'\n+ '[' -e /var/lib/tempest/external_files//clouds.yaml ']'\n+ mkdir -p /var/lib/tempest/.config/openstack\n+ cp /var/lib/tempest/external_files//clouds.yaml /var/lib/tempest/.config/openstack/clouds.yaml\n+ '[' -f /var/lib/tempest/external_files//profile.yaml ']'\n+ '[' -z '' ']'\n+ TEMPESTCONF_ARGS+='--profile /var/lib/tempest/external_files//profile.yaml '\n+ '[' '!' -f /var/lib/tempest/external_files//include.txt ']'\n+ '[' '!' -f /var/lib/tempest/external_files//exclude.txt ']'\n+ whitebox_neutron_tempest_plugin_workaround\n+ '[' -f /var/lib/tempest/id_ecdsa ']'\n+ '[' -z '' ']'\n+ run_rpm_tempest\n+ '[' 0 -ne 0 ']'\n+ rpm -qa\n+ grep tempest\n+ run_tempest\n+ pushd /var/lib/tempest\n+ tempest init openshift\n+ pushd /var/lib/tempest/openshift\n+ prepare_tempest_cleanup\n+ [[ false == true ]]\n+ upload_extra_images\n+ [[ -n '' ]]\n+ mkdir -p /var/lib/tempest/external_files//tempest/\n+ discover_tempest_config --create --debug --profile /var/lib/tempest/external_files//profile.yaml identity.v3_endpoint_type public DEFAULT.log_dir /var/lib/tempest/external_files//tempest/ load_balancer.test_server_path /usr/libexec/octavia-tempest-plugin-tests-httpd\n+ cat\n+ xargs discover-tempest-config\n/usr/lib/python3.9/site-packages/urllib3/connectionpool.py:1018: InsecureRequestWarning: Unverified HTTPS request is being made to host 'keystone-public-openstack.apps-crc.testing'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings\n \ warnings.warn(\n+ tempest run --include-list /var/lib/tempest/external_files//include.txt --exclude-list /var/lib/tempest/external_files//exclude.txt --concurrency 4\n+ RETURN_VALUE=1\n+ run_tempest_cleanup\n+ [[ false == true ]]\n+ popd\n+ popd\n+ print_config_files\n+ echo 'Excluded tests'\n+ '[' '!' 
-z '' ']'\n+ echo 'Included tests'\n+ '[' '!' -z '' ']'\n+ save_config_files\n+ mkdir -p /var/lib/tempest/external_files//tempest//etc\n+ cp -f '' /var/lib/tempest/external_files//tempest//etc\ncp: cannot stat '': No such file or directory\n+ cp -f '' /var/lib/tempest/external_files//tempest//etc\ncp: cannot stat '': No such file or directory\n+ cp -f /dev/null /var/lib/tempest/external_files//tempest//etc\n+ cp -f /var/lib/tempest/openshift/etc/tempest.conf '/var/lib/tempest/openshift/etc/*.ini' '/var/lib/tempest/openshift/etc/*.txt' /var/lib/tempest/openshift/etc/allow-list.yaml /var/lib/tempest/external_files//tempest//etc\ncp: cannot stat '/var/lib/tempest/openshift/etc/*.ini': No such file or directory\ncp: cannot stat '/var/lib/tempest/openshift/etc/*.txt': No such file or directory\n+ cp -f /var/lib/tempest/openshift/.stestr.conf /var/lib/tempest/external_files//tempest//stestr.conf\n+ tar -czf /var/lib/tempest/external_files//tempest//stestr.tar.gz -C /var/lib/tempest/openshift .stestr\n+ move_tempest_log tempest_results.log\n+ _FILENAME=tempest_results.log\n+ mv /var/lib/tempest/external_files//tempest//tempest.log /var/lib/tempest/external_files//tempest//tempest_results.log\nmv: cannot stat '/var/lib/tempest/external_files//tempest//tempest.log': No such file or directory\n+ generate_test_results tempest_results\n+ _FILENAME=tempest_results\n+ _SUBUNIT_FILE=/var/lib/tempest/external_files//tempest//tempest_results.subunit\n+ _RESULTS_XML=/var/lib/tempest/external_files//tempest//tempest_results.xml\n+ _RESULTS_HTML=/var/lib/tempest/external_files//tempest//tempest_results.html\n+ pushd /var/lib/tempest/openshift\n+ echo 'Generate file containing failing tests'\n+ stestr failing --list\n+ sed 's/\\[.*\\]//g'\n+ echo 'Generate subunit, then xml and html results'\n+ stestr last --subunit\n+ subunit2junitxml /var/lib/tempest/external_files//tempest//tempest_results.subunit -o /var/lib/tempest/external_files//tempest//tempest_results.xml\n+ subunit2html 
/var/lib/tempest/external_files//tempest//tempest_results.subunit /var/lib/tempest/external_files//tempest//tempest_results.html\n+ popd\n+ rerun_failed_tests\n+ '[' false = false ']'\n+ return 1\n+ check_expected_failures\n+ '[' -s /var/lib/tempest/external_files//tempest//stestr_failing.txt ']'\n+ '[' -s /dev/null ']'\n+ '[' false == true ']'\n+ exit 1\n" stderr_lines: - + RETURN_VALUE=0 - + HOMEDIR=/var/lib/tempest - + TEMPEST_PATH=/var/lib/tempest/ - + TEMPEST_DIR=/var/lib/tempest/openshift - + CONCURRENCY=4 - + TEMPESTCONF_ARGS= - + TEMPEST_ARGS= - + TEMPEST_DEBUG_MODE=false - + TEMPEST_CLEANUP=false - + RERUN_FAILED_TESTS=false - + RERUN_OVERRIDE_STATUS=false - + TEMPEST_EXPECTED_FAILURES_LIST=/dev/null - + '[' false == true ']' - + [[ -z '' ]] - + TEMPEST_WORKFLOW_STEP_DIR_NAME=tempest - + [[ ! -z true ]] - + TEMPEST_PATH=/var/lib/tempest/external_files/ - + TEMPEST_LOGS_DIR=/var/lib/tempest/external_files//tempest/ - + FAILED_TESTS_FILE=/var/lib/tempest/external_files//tempest//stestr_failing.txt - + [[ true == true ]] - + TEMPESTCONF_ARGS+='--create ' - + [[ '' == true ]] - + [[ '' == true ]] - + [[ '' == true ]] - + [[ true == true ]] - + TEMPESTCONF_ARGS+='--debug ' - + [[ '' == true ]] - + [[ '' == true ]] - + [[ '' == true ]] - + [[ '' == true ]] - + [[ '' == true ]] - + [[ ! -z '' ]] - + [[ ! -z '' ]] - + [[ ! -z '' ]] - + [[ ! -z '' ]] - + [[ ! -z '' ]] - + [[ ! -z '' ]] - + [[ ! -z '' ]] - + [[ ! -z '' ]] - + [[ ! -z '' ]] - + [[ ! -z '' ]] - + [[ ! -z '' ]] - + [[ ! -z '' ]] - ++ echo - ++ tr '\n' ' ' - + TEMPESTCONF_OVERRIDES=' identity.v3_endpoint_type public ' - + TEMPESTCONF_OVERRIDES+='DEFAULT.log_dir /var/lib/tempest/external_files//tempest/ ' - + [[ ! 
-z /usr/libexec/octavia-tempest-plugin-tests-httpd ]] - + TEMPESTCONF_OVERRIDES+='load_balancer.test_server_path /usr/libexec/octavia-tempest-plugin-tests-httpd ' - + TEMPEST_EXTERNAL_PLUGIN_GIT_URL= - + TEMPEST_EXTERNAL_PLUGIN_CHANGE_URL= - + TEMPEST_EXTERNAL_PLUGIN_REFSPEC= - + TEMPEST_EXTERNAL_PLUGIN_DIR=/var/lib/tempest/external-plugins - + VENV_DIR=/var/lib/tempest/external-plugins/.venv - + TEMPEST_EXTRA_RPMS= - + TEMPEST_EXTRA_IMAGES_URL= - + TEMPEST_EXTRA_IMAGES_DISK_FORMAT= - + TEMPEST_EXTRA_IMAGES_OS_CLOUD= - + TEMPEST_EXTRA_IMAGES_ID= - + TEMPEST_EXTRA_IMAGES_NAME= - + TEMPEST_EXTRA_IMAGES_CONTAINER_FORMAT= - + TEMPEST_EXTRA_IMAGES_FLAVOR_ID= - + TEMPEST_EXTRA_IMAGES_FLAVOR_RAM= - + TEMPEST_EXTRA_IMAGES_FLAVOR_DISK= - + TEMPEST_EXTRA_IMAGES_FLAVOR_VCPUS= - + TEMPEST_EXTRA_IMAGES_FLAVOR_NAME= - + TEMPEST_EXTRA_IMAGES_FLAVOR_OS_CLOUD= - + TEMPEST_EXTRA_IMAGES_CREATE_TIMEOUT= - "+ OLD_IFS=' \t" - '''' - + IFS=, - + read -ra TEMPEST_EXTERNAL_PLUGIN_GIT_URL - + read -ra TEMPEST_EXTERNAL_PLUGIN_CHANGE_URL - + read -ra TEMPEST_EXTERNAL_PLUGIN_REFSPEC - + read -ra TEMPEST_EXTRA_RPMS - + read -ra TEMPEST_EXTRA_IMAGES_URL - + read -ra TEMPEST_EXTRA_IMAGES_DISK_FORMAT - + read -ra TEMPEST_EXTRA_IMAGES_OS_CLOUD - + read -ra TEMPEST_EXTRA_IMAGES_ID - + read -ra TEMPEST_EXTRA_IMAGES_NAME - + read -ra TEMPEST_EXTRA_IMAGES_CONTAINER_FORMAT - + read -ra TEMPEST_EXTRA_IMAGES_CREATE_TIMEOUT - + read -ra TEMPEST_EXTRA_IMAGES_FLAVOR_ID - + read -ra TEMPEST_EXTRA_IMAGES_FLAVOR_RAM - + read -ra TEMPEST_EXTRA_IMAGES_FLAVOR_DISK - + read -ra TEMPEST_EXTRA_IMAGES_FLAVOR_VCPUS - + read -ra TEMPEST_EXTRA_IMAGES_FLAVOR_NAME - + read -ra TEMPEST_EXTRA_IMAGES_FLAVOR_OS_CLOUD - "+ IFS=' \t" - '''' - + [[ '' == true ]] - + [[ '' == true ]] - + [[ ! -z '' ]] - + [[ ! -z '' ]] - + [[ ! -z '' ]] - + [[ ! 
-z '' ]] - + [[ -z '' ]] - + TEMPEST_ARGS+='--include-list /var/lib/tempest/external_files//include.txt ' - + [[ -z '' ]] - + TEMPEST_ARGS+='--exclude-list /var/lib/tempest/external_files//exclude.txt ' - + [[ ! -z '' ]] - + [[ ! -z '' ]] - + '[' -n 4 ']' - + '[' -z '' ']' - + TEMPEST_ARGS+='--concurrency 4 ' - + export OS_CLOUD=default - + OS_CLOUD=default - + '[' '!' -z true ']' - + '[' -e /var/lib/tempest/external_files//clouds.yaml ']' - + mkdir -p /var/lib/tempest/.config/openstack - + cp /var/lib/tempest/external_files//clouds.yaml /var/lib/tempest/.config/openstack/clouds.yaml - + '[' -f /var/lib/tempest/external_files//profile.yaml ']' - + '[' -z '' ']' - + TEMPESTCONF_ARGS+='--profile /var/lib/tempest/external_files//profile.yaml ' - + '[' '!' -f /var/lib/tempest/external_files//include.txt ']' - + '[' '!' -f /var/lib/tempest/external_files//exclude.txt ']' - + whitebox_neutron_tempest_plugin_workaround - + '[' -f /var/lib/tempest/id_ecdsa ']' - + '[' -z '' ']' - + run_rpm_tempest - + '[' 0 -ne 0 ']' - + rpm -qa - + grep tempest - + run_tempest - + pushd /var/lib/tempest - + tempest init openshift - + pushd /var/lib/tempest/openshift - + prepare_tempest_cleanup - + [[ false == true ]] - + upload_extra_images - + [[ -n '' ]] - + mkdir -p /var/lib/tempest/external_files//tempest/ - + discover_tempest_config --create --debug --profile /var/lib/tempest/external_files//profile.yaml identity.v3_endpoint_type public DEFAULT.log_dir /var/lib/tempest/external_files//tempest/ load_balancer.test_server_path /usr/libexec/octavia-tempest-plugin-tests-httpd - + cat - + xargs discover-tempest-config - '/usr/lib/python3.9/site-packages/urllib3/connectionpool.py:1018: InsecureRequestWarning: Unverified HTTPS request is being made to host ''keystone-public-openstack.apps-crc.testing''. Adding certificate verification is strongly advised. 
See: https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings' - ' warnings.warn(' - + tempest run --include-list /var/lib/tempest/external_files//include.txt --exclude-list /var/lib/tempest/external_files//exclude.txt --concurrency 4 - + RETURN_VALUE=1 - + run_tempest_cleanup - + [[ false == true ]] - + popd - + popd - + print_config_files - + echo 'Excluded tests' - + '[' '!' -z '' ']' - + echo 'Included tests' - + '[' '!' -z '' ']' - + save_config_files - + mkdir -p /var/lib/tempest/external_files//tempest//etc - + cp -f '' /var/lib/tempest/external_files//tempest//etc - 'cp: cannot stat '''': No such file or directory' - + cp -f '' /var/lib/tempest/external_files//tempest//etc - 'cp: cannot stat '''': No such file or directory' - + cp -f /dev/null /var/lib/tempest/external_files//tempest//etc - + cp -f /var/lib/tempest/openshift/etc/tempest.conf '/var/lib/tempest/openshift/etc/*.ini' '/var/lib/tempest/openshift/etc/*.txt' /var/lib/tempest/openshift/etc/allow-list.yaml /var/lib/tempest/external_files//tempest//etc - 'cp: cannot stat ''/var/lib/tempest/openshift/etc/*.ini'': No such file or directory' - 'cp: cannot stat ''/var/lib/tempest/openshift/etc/*.txt'': No such file or directory' - + cp -f /var/lib/tempest/openshift/.stestr.conf /var/lib/tempest/external_files//tempest//stestr.conf - + tar -czf /var/lib/tempest/external_files//tempest//stestr.tar.gz -C /var/lib/tempest/openshift .stestr - + move_tempest_log tempest_results.log - + _FILENAME=tempest_results.log - + mv /var/lib/tempest/external_files//tempest//tempest.log /var/lib/tempest/external_files//tempest//tempest_results.log - 'mv: cannot stat ''/var/lib/tempest/external_files//tempest//tempest.log'': No such file or directory' - + generate_test_results tempest_results - + _FILENAME=tempest_results - + _SUBUNIT_FILE=/var/lib/tempest/external_files//tempest//tempest_results.subunit - + _RESULTS_XML=/var/lib/tempest/external_files//tempest//tempest_results.xml - + 
_RESULTS_HTML=/var/lib/tempest/external_files//tempest//tempest_results.html - + pushd /var/lib/tempest/openshift - + echo 'Generate file containing failing tests' - + stestr failing --list - + sed 's/\[.*\]//g' - + echo 'Generate subunit, then xml and html results' - + stestr last --subunit - + subunit2junitxml /var/lib/tempest/external_files//tempest//tempest_results.subunit -o /var/lib/tempest/external_files//tempest//tempest_results.xml - + subunit2html /var/lib/tempest/external_files//tempest//tempest_results.subunit /var/lib/tempest/external_files//tempest//tempest_results.html - + popd - + rerun_failed_tests - + '[' false = false ']' - + return 1 - + check_expected_failures - + '[' -s /var/lib/tempest/external_files//tempest//stestr_failing.txt ']' - + '[' -s /dev/null ']' - + '[' false == true ']' - + exit 1 stdout: "python3-tempest-41.0.0-0.20250124132801.a25e0df.el9.noarch\npython3-tempestconf-3.5.3-0.20250819134715.8515371.el9.noarch\nopenstack-tempest-41.0.0-0.20250124132801.a25e0df.el9.noarch\npython3-watcher-tests-tempest-3.0.0-0.20240131100157.92ca984.el9.noarch\npython3-designate-tests-tempest-0.22.0-0.20240409063647.347fdbc.el9.noarch\npython3-manila-tests-tempest-2.4.0-0.20240730171324.d9530e0.el9.noarch\npython3-keystone-tests-tempest-0.16.0-0.20240528071825.63cfcb9.el9.noarch\npython3-whitebox-tests-tempest-0.0.3-0.20240412161827.766ff04.el9.noarch\npython3-murano-tests-tempest-2.7.0-0.20240131092708.d2b794c.el9.noarch\npython3-trove-tests-tempest-2.2.0-0.20240131093157.d63e17a.el9.noarch\npython3-mistral-tests-tempest-2.2.0-0.20240131094539.2f92367.el9.noarch\npython3-kuryr-tests-tempest-0.15.1-0.20240131095631.ab45b2f.el9.noarch\npython3-whitebox-neutron-tests-tempest-0.9.2-0.20251111185731.12cf06c.el9.noarch\npython3-zaqar-tests-tempest-1.7.0-0.20240131094344.3813c99.el9.noarch\npython3-magnum-tests-tempest-2.1.0-0.20240131093411.ef90336.el9.noarch\npython3-octavia-tests-tempest-golang-2.6.0-0.20240409063333.a1a2bed.el9.x86_64\npython3-octavia
-tests-tempest-2.6.0-0.20240409063333.a1a2bed.el9.noarch\npython3-glance-tests-tempest-0.7.0-0.20240131091807.d6f7287.el9.noarch\npython3-heat-tests-tempest-2.1.0-0.20240409061406.5a48492.el9.noarch\npython3-telemetry-tests-tempest-2.5.1-0.20250603080835.ddfb79a.el9.noarch\npython3-neutron-tests-tempest-2.7.0-0.20240409063927.bcabf13.el9.noarch\npython3-networking-l2gw-tests-tempest-0.1.1-0.20230315174804.82e3d07.el9.noarch\npython3-sahara-tempest-0.16.0-0.20230314174536.98063d3.el9.noarch\npython3-sahara-tests-tempest-0.16.0-0.20230314174536.98063d3.el9.noarch\npython3-vitrage-tests-tempest-6.2.0-0.20240131094852.816b235.el9.noarch\npython3-cinder-tests-tempest-1.15.0-0.20240924072752.645067a.el9.noarch\npython3-ironic-tests-tempest-2.11.0-0.20241002133254.fd8163d.el9.noarch\npython3-barbican-tests-tempest-4.0.0-0.20240409062212.82b0e48.el9.noarch\nopenstack-tempest-all-41.0.0-0.20250124132801.a25e0df.el9.noarch\n~ /\n2026-01-22 16:53:18.295 9 INFO tempest [-] Using tempest config file /etc/tempest/tempest.conf\e[00m\n2026-01-22 16:53:18.339 9 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: barbican_tests\e[00m\n2026-01-22 16:53:18.339 9 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: cinder_tests\e[00m\n2026-01-22 16:53:18.339 9 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: designate\e[00m\n2026-01-22 16:53:18.340 9 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: glance_tests\e[00m\n2026-01-22 16:53:18.340 9 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: heat\e[00m\n2026-01-22 16:53:18.340 9 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: ironic_tests\e[00m\n2026-01-22 16:53:18.341 9 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: 
keystone_tests\e[00m\n2026-01-22 16:53:18.341 9 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: kuryr_tempest_tests\e[00m\n2026-01-22 16:53:18.341 9 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: magnum_tests\e[00m\n2026-01-22 16:53:18.341 9 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: manila_tests\e[00m\n2026-01-22 16:53:18.342 9 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: mistral_test\e[00m\n2026-01-22 16:53:18.342 9 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: murano_tests\e[00m\n2026-01-22 16:53:18.342 9 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: networking_l2gw_tempest_plugin\e[00m\n2026-01-22 16:53:18.343 9 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: neutron_tests\e[00m\n2026-01-22 16:53:18.343 9 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: octavia-tempest-plugin\e[00m\n2026-01-22 16:53:18.343 9 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: sahara_tempest_tests\e[00m\n2026-01-22 16:53:18.343 9 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: telemetry_tests\e[00m\n2026-01-22 16:53:18.344 9 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: trove_tests\e[00m\n2026-01-22 16:53:18.344 9 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: vitrage_tests\e[00m\n2026-01-22 16:53:18.344 9 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: watcher_tests\e[00m\n2026-01-22 16:53:18.344 9 INFO tempest.test_discover.plugins [-] Register additional config 
options from Tempest plugin: whitebox-neutron-tempest-plugin\e[00m\n2026-01-22 16:53:18.344 9 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: whitebox-tempest-plugin\e[00m\n2026-01-22 16:53:18.345 9 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: zaqar_tests\e[00m\n2026-01-22 16:53:18.345 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: barbican_tests\e[00m\n2026-01-22 16:53:18.345 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: cinder_tests\e[00m\n2026-01-22 16:53:18.345 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: designate\e[00m\n2026-01-22 16:53:18.345 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: glance_tests\e[00m\n2026-01-22 16:53:18.346 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: heat\e[00m\n2026-01-22 16:53:18.346 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: ironic_tests\e[00m\n2026-01-22 16:53:18.346 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: keystone_tests\e[00m\n2026-01-22 16:53:18.346 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: kuryr_tempest_tests\e[00m\n2026-01-22 16:53:18.346 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: magnum_tests\e[00m\n2026-01-22 16:53:18.346 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: manila_tests\e[00m\n2026-01-22 16:53:18.346 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: mistral_test\e[00m\n2026-01-22 16:53:18.346 9 INFO 
tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: murano_tests\e[00m\n2026-01-22 16:53:18.347 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: networking_l2gw_tempest_plugin\e[00m\n2026-01-22 16:53:18.347 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: neutron_tests\e[00m\n2026-01-22 16:53:18.347 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: octavia-tempest-plugin\e[00m\n2026-01-22 16:53:18.347 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: sahara_tempest_tests\e[00m\n2026-01-22 16:53:18.347 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: telemetry_tests\e[00m\n2026-01-22 16:53:18.347 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: trove_tests\e[00m\n2026-01-22 16:53:18.347 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: vitrage_tests\e[00m\n2026-01-22 16:53:18.348 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: watcher_tests\e[00m\n2026-01-22 16:53:18.348 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: whitebox-neutron-tempest-plugin\e[00m\n2026-01-22 16:53:18.348 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: whitebox-tempest-plugin\e[00m\n2026-01-22 16:53:18.348 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: zaqar_tests\e[00m\n2026-01-22 16:53:18.370 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: barbican_tests\e[00m\n2026-01-22 16:53:18.370 9 INFO tempest.test_discover.plugins 
[-] List additional config options registered by Tempest plugin: cinder_tests
2026-01-22 16:53:18.370 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: designate
2026-01-22 16:53:18.370 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: glance_tests
2026-01-22 16:53:18.370 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: heat
2026-01-22 16:53:18.370 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: ironic_tests
2026-01-22 16:53:18.370 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: keystone_tests
2026-01-22 16:53:18.370 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: kuryr_tempest_tests
2026-01-22 16:53:18.370 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: magnum_tests
2026-01-22 16:53:18.371 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: manila_tests
2026-01-22 16:53:18.371 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: mistral_test
2026-01-22 16:53:18.371 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: murano_tests
2026-01-22 16:53:18.371 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: networking_l2gw_tempest_plugin
2026-01-22 16:53:18.371 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: neutron_tests
2026-01-22 16:53:18.371 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: octavia-tempest-plugin
2026-01-22 16:53:18.371 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: sahara_tempest_tests
2026-01-22 16:53:18.371 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: telemetry_tests
2026-01-22 16:53:18.371 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: trove_tests
2026-01-22 16:53:18.371 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: vitrage_tests
2026-01-22 16:53:18.372 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: watcher_tests
2026-01-22 16:53:18.372 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: whitebox-neutron-tempest-plugin
2026-01-22 16:53:18.372 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: whitebox-tempest-plugin
2026-01-22 16:53:18.372 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: zaqar_tests
2026-01-22 16:53:18.382 9 WARNING oslo_config.generator [-] "enabled_datastores" is missing a help string
2026-01-22 16:53:18.395 9 WARNING oslo_config.generator [-] "zabbix_alarms_per_host" is missing a help string
~/openshift ~ /
2026-01-22 16:53:19.241 13 INFO tempest [-] Using tempest config file /var/lib/tempest/openshift/etc/tempest.conf
2026-01-22 16:53:19.317 13 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: barbican_tests
2026-01-22 16:53:19.318 13 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: cinder_tests
2026-01-22 16:53:19.318 13 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: designate
2026-01-22 16:53:19.318 13 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: glance_tests
2026-01-22 16:53:19.318 13 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: heat
2026-01-22 16:53:19.318 13 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: ironic_tests
2026-01-22 16:53:19.319 13 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: keystone_tests
2026-01-22 16:53:19.319 13 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: kuryr_tempest_tests
2026-01-22 16:53:19.319 13 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: magnum_tests
2026-01-22 16:53:19.320 13 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: manila_tests
2026-01-22 16:53:19.320 13 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: mistral_test
2026-01-22 16:53:19.320 13 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: murano_tests
2026-01-22 16:53:19.320 13 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: networking_l2gw_tempest_plugin
2026-01-22 16:53:19.320 13 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: neutron_tests
2026-01-22 16:53:19.321 13 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: octavia-tempest-plugin
2026-01-22 16:53:19.321 13 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: sahara_tempest_tests
2026-01-22 16:53:19.321 13 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: telemetry_tests
2026-01-22 16:53:19.321 13 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: trove_tests
2026-01-22 16:53:19.322 13 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: vitrage_tests
2026-01-22 16:53:19.322 13 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: watcher_tests
2026-01-22 16:53:19.322 13 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: whitebox-neutron-tempest-plugin
2026-01-22 16:53:19.322 13 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: whitebox-tempest-plugin
2026-01-22 16:53:19.322 13 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: zaqar_tests
2026-01-22 16:53:19.323 13 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: barbican_tests
2026-01-22 16:53:19.323 13 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: cinder_tests
2026-01-22 16:53:19.324 13 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: designate
2026-01-22 16:53:19.324 13 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: glance_tests
2026-01-22 16:53:19.324 13 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: heat
2026-01-22 16:53:19.324 13 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: ironic_tests
2026-01-22 16:53:19.324 13 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: keystone_tests
2026-01-22 16:53:19.324 13 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: kuryr_tempest_tests
2026-01-22 16:53:19.324 13 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: magnum_tests
2026-01-22 16:53:19.324 13 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: manila_tests
2026-01-22 16:53:19.324 13 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: mistral_test
2026-01-22 16:53:19.325 13 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: murano_tests
2026-01-22 16:53:19.325 13 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: networking_l2gw_tempest_plugin
2026-01-22 16:53:19.325 13 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: neutron_tests
2026-01-22 16:53:19.325 13 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: octavia-tempest-plugin
2026-01-22 16:53:19.325 13 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: sahara_tempest_tests
2026-01-22 16:53:19.325 13 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: telemetry_tests
2026-01-22 16:53:19.325 13 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: trove_tests
2026-01-22 16:53:19.325 13 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: vitrage_tests
2026-01-22 16:53:19.325 13 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: watcher_tests
2026-01-22 16:53:19.325 13 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: whitebox-neutron-tempest-plugin
2026-01-22 16:53:19.325 13 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: whitebox-tempest-plugin
2026-01-22 16:53:19.326 13 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: zaqar_tests
2026-01-22 16:53:19.350 13 DEBUG config_tempest.constants [-] Setting basic default values load_basic_defaults /usr/lib/python3.9/site-packages/config_tempest/main.py:80
2026-01-22 16:53:19.350 13 DEBUG config_tempest.constants [-] Setting [DEFAULT] debug = true set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:19.350 13 DEBUG config_tempest.constants [-] Setting [DEFAULT] use_stderr = false set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:19.350 13 DEBUG config_tempest.constants [-] Setting [DEFAULT] log_file = tempest.log set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:19.351 13 DEBUG config_tempest.constants [-] Setting [identity] username = demo_tempestconf set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:19.351 13 DEBUG config_tempest.constants [-] Setting [identity] password = secrete set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:19.351 13 DEBUG config_tempest.constants [-] Setting [identity] project_name = demo set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:19.351 13 DEBUG config_tempest.constants [-] Setting [identity] project_domain_name = Default set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:19.351 13 DEBUG config_tempest.constants [-] Setting [identity] user_domain_name = Default set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:19.351 13 DEBUG config_tempest.constants [-] Setting [identity] alt_username = alt_demo_tempestconf set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:19.351 13 DEBUG config_tempest.constants [-] Setting [identity] alt_password = secrete set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:19.352 13 DEBUG config_tempest.constants [-] Setting [identity] alt_project_name = alt_demo set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:19.352 13 DEBUG config_tempest.constants [-] Setting [auth] tempest_roles = member set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:19.352 13 DEBUG config_tempest.constants [-] Setting [auth] admin_username = admin set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:19.352 13 DEBUG config_tempest.constants [-] Setting [auth] admin_project_name = admin set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:19.352 13 DEBUG config_tempest.constants [-] Setting [auth] admin_domain_name = Default set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:19.353 13 DEBUG config_tempest.constants [-] Setting [auth] admin_project_domain_name = Default set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:19.353 13 DEBUG config_tempest.constants [-] Setting [auth] admin_user_domain_name = Default set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:19.353 13 DEBUG config_tempest.constants [-] Setting [object-storage] reseller_admin_role = ResellerAdmin set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:19.353 13 DEBUG config_tempest.constants [-] Setting [oslo-concurrency] lock_path = /tmp set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:19.353 13 DEBUG config_tempest.constants [-] Setting [compute-feature-enabled] preserve_ports = true set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:19.354 13 DEBUG config_tempest.constants [-] Setting [network-feature-enabled] ipv6_subnet_attributes = true set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:19.354 13 DEBUG config_tempest.constants [-] Setting [scenario] dhcp_client = dhcpcd set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:19.354 13 DEBUG config_tempest.constants [-] Setting [image] image_path = https://download.cirros-cloud.net/0.6.2/cirros-0.6.2-x86_64-disk.img set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:19.354 13 DEBUG config_tempest.constants [-] Setting [auth] admin_username = admin set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:19.354 13 DEBUG config_tempest.constants [-] Setting [auth] admin_project_name = admin set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:19.354 13 DEBUG config_tempest.constants [-] Setting [auth] admin_password = 12345678 set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:19.354 13 DEBUG config_tempest.constants [-] Setting [auth] admin_project_domain_name = Default set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:19.355 13 DEBUG config_tempest.constants [-] Setting [auth] admin_user_domain_name = Default set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:19.355 13 DEBUG config_tempest.constants [-] Setting [identity] uri = https://keystone-public-openstack.apps-crc.testing set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:19.355 13 DEBUG config_tempest.constants [-] Setting [identity] region = regionOne set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:19.355 13 DEBUG config_tempest.constants [-] Setting [compute-feature-enabled] dhcp_domain = set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:19.355 13 DEBUG config_tempest.constants [-] Setting [compute-feature-enabled] vnc_console = True set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:19.355 13 DEBUG config_tempest.constants [-] Setting [identity] v2_admin_endpoint_type = public set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:19.355 13 DEBUG config_tempest.constants [-] Setting [identity] v3_endpoint_type = public set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:19.355 13 DEBUG config_tempest.constants [-] Setting [validation] run_validation = True set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:19.355 13 DEBUG config_tempest.constants [-] Setting [volume] catalog_type = volumev3 set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:19.356 13 DEBUG config_tempest.constants [-] Setting [identity] uri_v3 = https://keystone-public-openstack.apps-crc.testing set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:19.388 13 DEBUG config_tempest.constants [-] Setting [identity] disable_ssl_certificate_validation = False set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:19.388 13 DEBUG config_tempest.constants [-] Setting [identity] uri_v3 = https://keystone-public-openstack.apps-crc.testing/v3 set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:19.710 13 INFO tempest.lib.common.rest_client [req-bbd86b24-5437-425d-ae10-0806b4ea6b90 req-bbd86b24-5437-425d-ae10-0806b4ea6b90 ] Request (main): 201 POST https://keystone-public-openstack.apps-crc.testing/v3/auth/tokens 0.321s
2026-01-22 16:53:19.768 13 INFO tempest.lib.common.rest_client [req-4042fd7b-aa4f-40f0-85e7-ed950c718f11 req-4042fd7b-aa4f-40f0-85e7-ed950c718f11 ] Request (main): 200 GET https://keystone-public-openstack.apps-crc.testing/v3/projects 0.056s
2026-01-22 16:53:19.769 13 DEBUG config_tempest.constants [-] Setting [auth] admin_project_id = 8d88b00a23ef40338653b967006abf05 set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:19.769 13 INFO config_tempest.constants [-] Creating user 'demo_tempestconf' with project 'demo' and password 'secrete'
2026-01-22 16:53:20.796 13 INFO tempest.lib.common.rest_client [req-c0f26dc3-f0bd-4e54-94c3-73bd36eca433 req-c0f26dc3-f0bd-4e54-94c3-73bd36eca433 ] Request (main): 201 POST https://keystone-public-openstack.apps-crc.testing/v3/projects 1.023s
2026-01-22 16:53:20.911 13 INFO tempest.lib.common.rest_client [req-79451834-f1ee-4308-bbb2-c0df84ac9126 req-79451834-f1ee-4308-bbb2-c0df84ac9126 ] Request (main): 200 GET https://keystone-public-openstack.apps-crc.testing/v3/projects 0.114s
2026-01-22 16:53:21.894 13 INFO tempest.lib.common.rest_client [req-98bc3fcb-4c9a-4f6d-9875-67da909f0180 req-98bc3fcb-4c9a-4f6d-9875-67da909f0180 ] Request (main): 201 POST https://keystone-public-openstack.apps-crc.testing/v3/users 0.983s
2026-01-22 16:53:21.894 13 INFO config_tempest.constants [-] Creating user 'alt_demo_tempestconf' with project 'alt_demo' and password 'secrete'
2026-01-22 16:53:21.972 13 INFO tempest.lib.common.rest_client [req-e33400e4-3bc7-436d-af30-caabc20ab7bd req-e33400e4-3bc7-436d-af30-caabc20ab7bd ] Request (main): 201 POST https://keystone-public-openstack.apps-crc.testing/v3/projects 0.077s
2026-01-22 16:53:22.034 13 INFO tempest.lib.common.rest_client [req-08dbf2ac-f9ab-4a02-961b-577fad5a674b req-08dbf2ac-f9ab-4a02-961b-577fad5a674b ] Request (main): 200 GET https://keystone-public-openstack.apps-crc.testing/v3/projects 0.061s
2026-01-22 16:53:22.371 13 INFO tempest.lib.common.rest_client [req-6d19e5ea-fbd8-48af-936c-b1b73a0b73da req-6d19e5ea-fbd8-48af-936c-b1b73a0b73da ] Request (main): 201 POST https://keystone-public-openstack.apps-crc.testing/v3/users 0.337s
2026-01-22 16:53:22.407 13 INFO tempest.lib.common.rest_client [req-dd3665c0-f564-4bd1-9536-3f4311d87272 req-dd3665c0-f564-4bd1-9536-3f4311d87272 ] Request (main): 200 GET https://keystone-public-openstack.apps-crc.testing/v3/projects 0.035s
2026-01-22 16:53:22.441 13 INFO tempest.lib.common.rest_client [req-efda4e24-f60c-4b72-b1b7-9df886474fbc req-efda4e24-f60c-4b72-b1b7-9df886474fbc ] Request (main): 200 GET https://keystone-public-openstack.apps-crc.testing/v3/users 0.033s
2026-01-22 16:53:22.496 13 INFO tempest.lib.common.rest_client [req-aa10264e-1800-4e2e-8091-89ae18a145f7 req-aa10264e-1800-4e2e-8091-89ae18a145f7 ] Request (main): 200 GET https://keystone-public-openstack.apps-crc.testing/v3/roles 0.055s
2026-01-22 16:53:22.550 13 INFO tempest.lib.common.rest_client [req-63793445-b3db-48de-ae48-19cf44769668 req-63793445-b3db-48de-ae48-19cf44769668 ] Request (main): 204 PUT https://keystone-public-openstack.apps-crc.testing/v3/projects/3693723342564b83802fc743f8353811/users/67f0771030f34f5ca522a5a02cbb774e/roles/9593ff08a65b46ceac5b0ac1558f2f3f 0.053s
2026-01-22 16:53:22.550 13 DEBUG config_tempest.constants [-] User 'admin' was given the 'admin' role in project 'demo' give_role_to_user /usr/lib/python3.9/site-packages/config_tempest/users.py:106
2026-01-22 16:53:22.623 13 INFO tempest.lib.common.rest_client [req-0b47e739-0aec-4d4a-a692-e6d648ab8613 req-0b47e739-0aec-4d4a-a692-e6d648ab8613 ] Request (main): 200 GET https://keystone-public-openstack.apps-crc.testing/v3/services 0.073s
2026-01-22 16:53:23.225 13 DEBUG config_tempest.constants [-] Setting [service_available] aodh = False set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:23.225 13 DEBUG config_tempest.constants [-] Setting [service_available] ironic = False set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:23.226 13 DEBUG config_tempest.constants [-] Setting [service_available] ceilometer = False set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:23.253 13 DEBUG config_tempest.constants [-] Setting [compute-feature-enabled] console_output = True set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:23.511 13 INFO tempest.lib.common.rest_client [req-a3d752b3-1671-4084-82f4-7678f9018d52 req-a3d752b3-1671-4084-82f4-7678f9018d52 ] Request (main): 200 GET https://nova-public-openstack.apps-crc.testing/v2.1/os-hosts 0.257s
2026-01-22 16:53:23.512 13 DEBUG config_tempest.constants [-] Setting [compute] min_compute_nodes = 1 set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:23.512 13 DEBUG config_tempest.constants [-] Setting [compute] min_microversion = 2.1 set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:23.512 13 DEBUG config_tempest.constants [-] Setting [compute] max_microversion = 2.95 set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:23.512 13 DEBUG config_tempest.constants [-] Setting [service_available] nova = True set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:23.512 13 DEBUG config_tempest.constants [-] Setting [service_available] sahara = False set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:23.512 13 DEBUG config_tempest.constants [-] Setting [service_available] trove = False set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:23.512 13 DEBUG config_tempest.constants [-] Setting [service_available] designate = False set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:23.512 13 DEBUG config_tempest.constants [-] Setting [service_available] panko = False set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:23.549 13 DEBUG config_tempest.constants [-] Setting [identity] auth_version = v3 set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:24.109 13 INFO tempest.lib.common.rest_client [req-539f3d6f-47ec-4925-98df-cecf0be7ddbf req-539f3d6f-47ec-4925-98df-cecf0be7ddbf ] Request (main): 200 GET https://glance-default-public-openstack.apps-crc.testing/v2/info/stores 0.518s
2026-01-22 16:53:24.109 13 DEBUG config_tempest.constants [-] Setting [image-feature-enabled] import_image = True set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:24.109 13 DEBUG config_tempest.constants [-] Setting [validation] image_ssh_user = cirros set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:24.611 13 INFO tempest.lib.common.rest_client [req-0e3cb39b-a0ad-4b82-92a7-c4aba6bf099f req-0e3cb39b-a0ad-4b82-92a7-c4aba6bf099f ] Request (main): 200 GET https://glance-default-public-openstack.apps-crc.testing/v2/images 0.501s
2026-01-22 16:53:24.612 13 DEBUG config_tempest.constants [-] Setting [image] http_image = https://download.cirros-cloud.net/0.6.2/cirros-0.6.2-x86_64-disk.img set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:24.612 13 DEBUG config_tempest.constants [-] Setting [image] http_qcow2_image = https://download.cirros-cloud.net/0.6.2/cirros-0.6.2-x86_64-disk.img set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:24.612 13 DEBUG config_tempest.constants [-] Setting [service_available] glance = True set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:24.612 13 DEBUG config_tempest.constants [-] Setting [service_available] barbican = True set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:24.612 13 DEBUG config_tempest.constants [-] Setting [service_available] zaqar = False set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:24.612 13 DEBUG config_tempest.constants [-] Setting [service_available] gnocchi = False set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:24.841 13 DEBUG config_tempest.constants [-] Setting [service_available] neutron = True set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:25.211 13 INFO tempest.lib.common.rest_client [tx2716d5cfc4ef4f7d8c823-0069725605 tx2716d5cfc4ef4f7d8c823-0069725605 ] Request (main): 200 GET https://swift-public-openstack.apps-crc.testing/healthcheck 0.036s
2026-01-22 16:53:25.257 13 INFO tempest.lib.common.rest_client [req-14646571-1084-4fc2-9d0e-14e9f2589c61 req-14646571-1084-4fc2-9d0e-14e9f2589c61 ] Request (main): 200 GET https://keystone-public-openstack.apps-crc.testing/v3/roles 0.045s
2026-01-22 16:53:25.258 13 INFO config_tempest.constants [-] Creating ResellerAdmin role
2026-01-22 16:53:25.319 13 INFO tempest.lib.common.rest_client [req-90a783fd-ca3a-467e-b829-145248a8a155 req-90a783fd-ca3a-467e-b829-145248a8a155 ] Request (main): 201 POST https://keystone-public-openstack.apps-crc.testing/v3/roles 0.060s
2026-01-22 16:53:25.319 13 DEBUG config_tempest.constants [-] Setting [object-storage] operator_role = admin set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:25.319 13 DEBUG config_tempest.constants [-] Setting [service_available] swift = True set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:25.319 13 DEBUG config_tempest.constants [-] Setting [service_available] octavia = False set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:25.319 13 DEBUG config_tempest.constants [-] Setting [service_available] heat = False set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:25.350 13 DEBUG config_tempest.constants [-] Setting [placement] min_microversion = 1.0 set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:25.350 13 DEBUG config_tempest.constants [-] Setting [placement] max_microversion = 1.39 set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:25.350 13 DEBUG config_tempest.constants [-] Setting [service_available] placement = True set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:25.351 13 DEBUG config_tempest.constants [-] Setting [service_available] manila = False set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:25.351 13 DEBUG config_tempest.constants [-] Setting [service_available] manila = False set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:25.351 13 DEBUG config_tempest.constants [-] Setting [service_available] ceilometer = False set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:25.581 13 DEBUG config_tempest.constants [-] Setting [volume] min_microversion = 3.0 set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:25.581 13 DEBUG config_tempest.constants [-] Setting [volume] max_microversion = 3.70 set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:25.889 13 DEBUG config_tempest.constants [-] Setting [service_available] cinder = True set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:25.889 13 DEBUG config_tempest.constants [-] Setting [service_available] watcher = False set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:25.889 13 DEBUG config_tempest.constants [-] Setting [service_available] mistral = False set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:25.889 13 DEBUG config_tempest.constants [-] Setting [service_available] mistral = False set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:26.061 13 INFO tempest.lib.common.rest_client [req-26e88561-dd79-4d43-a7c6-01c0dc841bc5 req-26e88561-dd79-4d43-a7c6-01c0dc841bc5 ] Request (main): 200 GET https://nova-public-openstack.apps-crc.testing/v2.1/flavors 0.171s
2026-01-22 16:53:26.063 13 DEBUG config_tempest.constants [-] Setting [volume] volume_size = 1 set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:26.063 13 INFO config_tempest.constants [-] Creating flavor 'm1.nano'
2026-01-22 16:53:26.279 13 INFO tempest.lib.common.rest_client [req-f2d1671f-d845-426f-8ddb-f95ddc7bb313 req-f2d1671f-d845-426f-8ddb-f95ddc7bb313 ] Request (main): 200 POST https://nova-public-openstack.apps-crc.testing/v2.1/flavors 0.216s
2026-01-22 16:53:26.480 13 INFO tempest.lib.common.rest_client [req-9d7fdd7e-fd8e-4b7d-a965-4eba5745ac48 req-9d7fdd7e-fd8e-4b7d-a965-4eba5745ac48 ] Request (main): 200 POST https://nova-public-openstack.apps-crc.testing/v2.1/flavors/8d1ce660-7497-440b-8666-00c695d0b4d2/os-extra_specs 0.198s
2026-01-22 16:53:26.482 13 DEBUG config_tempest.constants [-] Setting [compute] flavor_ref = 8d1ce660-7497-440b-8666-00c695d0b4d2 set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:26.482 13 INFO config_tempest.constants [-] Creating flavor 'm1.micro'
2026-01-22 16:53:26.555 13 INFO tempest.lib.common.rest_client [req-ec4338c5-2f6f-4c35-ab33-4dc4572c7180 req-ec4338c5-2f6f-4c35-ab33-4dc4572c7180 ] Request (main): 200 POST https://nova-public-openstack.apps-crc.testing/v2.1/flavors 0.073s
2026-01-22 16:53:26.637 13 INFO tempest.lib.common.rest_client [req-9902fef3-bc9c-4875-9465-35b1f336be35 req-9902fef3-bc9c-4875-9465-35b1f336be35 ] Request (main): 200 POST https://nova-public-openstack.apps-crc.testing/v2.1/flavors/c36c4338-67fc-4ac7-9a68-89ed828dd90b/os-extra_specs 0.078s
2026-01-22 16:53:26.639 13 DEBUG config_tempest.constants [-] Setting [compute] flavor_ref_alt = c36c4338-67fc-4ac7-9a68-89ed828dd90b set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:27.217 13 INFO tempest.lib.common.rest_client [req-e518f2e7-50bf-424e-b9af-884f93676b01 req-e518f2e7-50bf-424e-b9af-884f93676b01 ] Request (main): 200 GET https://glance-default-public-openstack.apps-crc.testing/v2/images 0.578s
2026-01-22 16:53:27.218 13 INFO config_tempest.constants [-] Creating image 'cirros-0.6.2-x86_64-disk.img'
2026-01-22 16:53:27.218 13 INFO config_tempest.constants [-] Downloading 'https://download.cirros-cloud.net/0.6.2/cirros-0.6.2-x86_64-disk.img' and saving as '/var/lib/tempest/openshift/etc/cirros-0.6.2-x86_64-disk.img'
2026-01-22 16:53:27.683 13 INFO config_tempest.constants [-] Uploading image 'cirros-0.6.2-x86_64-disk.img' from '/var/lib/tempest/openshift/etc/cirros-0.6.2-x86_64-disk.img'
2026-01-22 16:53:28.186 13 INFO tempest.lib.common.rest_client [req-5d94114a-9da1-4961-84dd-ab351751780d req-5d94114a-9da1-4961-84dd-ab351751780d ] Request (main): 201 POST https://glance-default-public-openstack.apps-crc.testing/v2/images 0.502s
2026-01-22 16:53:29.458 13 INFO tempest.lib.common.rest_client [req-c01f91ba-f79d-4b85-b080-84572f3ec74d req-c01f91ba-f79d-4b85-b080-84572f3ec74d ] Request (main): 204 PUT https://glance-default-public-openstack.apps-crc.testing/v2/images/e1b65bbe-5c14-4552-a5d9-d275c9dd42d3/file 1.271s
2026-01-22 16:53:29.522 13 INFO tempest.lib.common.rest_client [req-3e340864-eed3-4055-b078-2836a990c73c req-3e340864-eed3-4055-b078-2836a990c73c ] Request (main): 200 GET https://glance-default-public-openstack.apps-crc.testing/v2/images 0.064s
2026-01-22 16:53:29.523 13 INFO config_tempest.constants [-] Creating image 'cirros-0.6.2-x86_64-disk.img_alt'
2026-01-22 16:53:29.523 13 INFO config_tempest.constants [-] Image 'https://download.cirros-cloud.net/0.6.2/cirros-0.6.2-x86_64-disk.img' already fetched to '/var/lib/tempest/openshift/etc/cirros-0.6.2-x86_64-disk.img'.
2026-01-22 16:53:29.523 13 INFO config_tempest.constants [-] Uploading image 'cirros-0.6.2-x86_64-disk.img_alt' from '/var/lib/tempest/openshift/etc/cirros-0.6.2-x86_64-disk.img'
2026-01-22 16:53:29.650 13 INFO tempest.lib.common.rest_client [req-c97af78d-2299-418f-8eef-4db191378910 req-c97af78d-2299-418f-8eef-4db191378910 ] Request (main): 201 POST https://glance-default-public-openstack.apps-crc.testing/v2/images 0.127s
2026-01-22 16:53:30.698 13 INFO tempest.lib.common.rest_client [req-cf88e403-6c67-4338-b823-b182193aca7c req-cf88e403-6c67-4338-b823-b182193aca7c ] Request (main): 204 PUT https://glance-default-public-openstack.apps-crc.testing/v2/images/a33c2bad-821b-43f1-aa77-518d2843bb18/file 1.048s
2026-01-22 16:53:30.699 13 DEBUG config_tempest.constants [-] Setting [scenario] img_file = /var/lib/tempest/openshift/etc/cirros-0.6.2-x86_64-disk.img set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:30.699 13 DEBUG config_tempest.constants [-] Setting [compute] image_ref = e1b65bbe-5c14-4552-a5d9-d275c9dd42d3 set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:30.699 13 DEBUG config_tempest.constants [-] Setting [compute] image_ref_alt = a33c2bad-821b-43f1-aa77-518d2843bb18 set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:30.699 13 INFO config_tempest.constants [-] Setting up network
2026-01-22 16:53:30.699 13 INFO config_tempest.constants [-] No network supplied, trying auto discover for an external network while prioritizing the one called public, if not found, the network discovered last will be used.
2026-01-22 16:53:31.347 13 INFO tempest.lib.common.rest_client [req-6b91ca0f-9aae-46d3-8d50-b7a76dacfdd6 req-6b91ca0f-9aae-46d3-8d50-b7a76dacfdd6 ] Request (main): 200 GET https://neutron-public-openstack.apps-crc.testing/v2.0/networks 0.648s
2026-01-22 16:53:31.348 13 INFO config_tempest.constants [-] Setting 9663874c-fdcf-40bd-bc5f-873b4ba46792 as the public network for tempest
2026-01-22 16:53:31.348 13 DEBUG config_tempest.constants [-] Setting [network] public_network_id = 9663874c-fdcf-40bd-bc5f-873b4ba46792 set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:31.348 13 DEBUG config_tempest.constants [-] Setting [network] floating_network_name = public set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:31.416 13 INFO tempest.lib.common.rest_client [req-b4bebe47-26d2-4136-b6bb-76786a557bb7 req-b4bebe47-26d2-4136-b6bb-76786a557bb7 ] Request (main): 200 GET https://cinder-public-openstack.apps-crc.testing/v3/os-services?binary=cinder-backup 0.068s
2026-01-22 16:53:31.418 13 DEBUG config_tempest.constants [-] Setting [volume-feature-enabled] backup = False set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:31.436 13 DEBUG config_tempest.constants [-] Setting [service_available] horizon = False set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:31.437 13 DEBUG config_tempest.constants [-] Setting [identity-feature-enabled] api_v2 = False set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:31.437 13 DEBUG config_tempest.constants [-] Setting [identity-feature-enabled] api_v3 = True set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:31.437 13 DEBUG config_tempest.constants [-] Setting [image-feature-enabled] api_v1 = False set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:31.437 13 DEBUG config_tempest.constants [-] Setting [image-feature-enabled] api_v2 = True set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:31.462 13 DEBUG config_tempest.constants [-] Setting [identity-feature-enabled] api_extensions = OS-OAUTH2,OS-ENDPOINT-POLICY,OS-EC2,OS-SIMPLE-CERT,OS-PKI,OS-REVOKE,s3tokens,OS-FEDERATION,OS-INHERIT,OS-TRUST,OS-EP-FILTER,OS-OAUTH1 set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:31.462 13 DEBUG config_tempest.constants [-] Setting [network-feature-enabled] api_extensions = address-group,address-scope,agent,allowed-address-pairs,auto-allocated-topology,availability_zone,default-subnetpools,dhcp_agent_scheduler,dns-integration,dns-domain-ports,dns-integration-domain-keywords,expose-port-forwarding-in-fip,external-net,extra_dhcp_opt,extraroute,filter-validation,floating-ip-port-forwarding-description,floating-ip-port-forwarding-detail,floating-ip-port-forwarding-port-ranges,fip-port-details,flavors,floating-ip-port-forwarding,floatingip-pools,ip_allocation,l2_adjacency,router,ext-gw-mode,logging,multi-provider,net-mtu,net-mtu-writable,network_availability_zone,network-ip-availability,pagination,port-device-profile,port-mac-address-regenerate,port-numa-affinity-policy,port-resource-request,port-resource-request-groups,binding,binding-extended,port-security,project-id,provider,qos,qos-bw-limit-direction,qos-bw-minimum-ingress,qos-default,qos-fip,qos-gateway-ip,qos-port-network-policy,qos-pps-minimum,qos-pps-minimum-rule-alias,qos-pps,qos-rule-type-details,qos-rule-type-filter,qos-rules-alias,quota-check-limit,quotas,quota_details,rbac-policies,rbac-address-scope,rbac-security-groups,revision-if-match,standard-attr-revisions,router_availability_zone,security-groups-normalized-cidr,security-groups-remote-address-group,security-groups-shared-filtering,security-group,segment,segments-peer-subnet-host-routes,service-type,sorting,standard-attr-segment,standard-attr-description,stateful-security-group,subnet-dns-publish-fixed-ip,subnet-segmentid-writable,subnet-service-types,subnet_allocation,subnetpool-prefix-ops,standard-attr-tag,standard-attr-timestamp,trunk,trunk-details set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105
2026-01-22 16:53:31.462 13 DEBUG config_tempest.constants [-] Setting [object-storage-feature-enabled] discoverable_apis = symlink,versioned_writes,slo,account_quotas,container_quotas,staticweb,s3api,formpost,ratelimit,tempurl,bulk_upload,bulk_delete set
/usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105\e[00m\n2026-01-22 16:53:31.462 13 DEBUG config_tempest.constants [-] Setting [volume-feature-enabled] api_extensions = os-admin-actions,os-availability-zone,backups,capabilities,cgsnapshots,consistencygroups,os-extended-services,os-extended-snapshot-attributes,os-hosts,qos-specs,os-quota-class-sets,os-quota-sets,OS-SCH-HNT,scheduler-stats,os-services,os-snapshot-actions,os-snapshot-manage,os-snapshot-unmanage,os-types-extra-specs,os-types-manage,os-used-limits,os-volume-actions,os-volume-encryption-metadata,os-vol-host-attr,os-vol-image-meta,os-volume-manage,os-vol-mig-status-attr,os-vol-tenant-attr,os-volume-transfer,os-volume-type-access,encryption,os-volume-unmanage set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105\e[00m\n2026-01-22 16:53:31.462 13 INFO config_tempest.constants [-] Creating configuration file /var/lib/tempest/openshift/etc/tempest.conf\e[00m\n{1} neutron_tempest_plugin.api.admin.test_logging.LoggingTestJSON.test_list_supported_logging_types [0.479288s] ... ok\n{1} neutron_tempest_plugin.api.admin.test_logging.LoggingTestJSON.test_log_deleted_with_corresponding_security_group [1.452320s] ... ok\n{1} neutron_tempest_plugin.api.admin.test_logging.LoggingTestJSON.test_log_lifecycle [1.823059s] ... ok\n{2} neutron_tempest_plugin.api.admin.test_extension_driver_port_security_admin.PortSecurityAdminTests.test_create_port_security_false_on_shared_network [4.686974s] ... ok\n{3} setUpClass (neutron_tempest_plugin.api.admin.test_agent_availability_zone.AgentAvailabilityZoneTestCase) ... SKIPPED: availability_zone supported agent not found.\n{3} neutron_tempest_plugin.api.admin.test_agent_management.AgentManagementTestJSON.test_delete_agent_negative [0.153653s] ... ok\n{3} neutron_tempest_plugin.api.admin.test_agent_management.AgentManagementTestJSON.test_list_agent [0.264714s] ... 
ok\n{3} neutron_tempest_plugin.api.admin.test_agent_management.AgentManagementTestJSON.test_list_agents_non_admin [0.577192s] ... ok\n{3} neutron_tempest_plugin.api.admin.test_agent_management.AgentManagementTestJSON.test_show_agent [0.200018s] ... ok\n{1} neutron_tempest_plugin.api.admin.test_logging_negative.LoggingNegativeTestJSON.test_create_log_with_invalid_resource_type [0.477196s] ... ok\n{1} neutron_tempest_plugin.api.admin.test_logging_negative.LoggingNegativeTestJSON.test_create_log_with_nonexistent_port [0.204151s] ... ok\n{1} neutron_tempest_plugin.api.admin.test_logging_negative.LoggingNegativeTestJSON.test_create_log_with_nonexistent_sg [0.071226s] ... ok\n{3} neutron_tempest_plugin.api.admin.test_agent_management.AgentManagementTestJSON.test_update_agent_description [0.413646s] ... ok\n{3} neutron_tempest_plugin.api.admin.test_agent_management.AgentManagementTestJSON.test_update_agent_status [0.198925s] ... ok\n{1} setUpClass (neutron_tempest_plugin.api.admin.test_ports.PortTestCasesResourceRequest) ... SKIPPED: Skipped as provider VLANs are not available in config\n{3} setUpClass (neutron_tempest_plugin.api.admin.test_default_security_group_rules.DefaultSecurityGroupRuleTest) ... SKIPPED: security-groups-default-rules extension not enabled.\n{3} setUpClass (neutron_tempest_plugin.api.admin.test_l3_agent_scheduler.L3AgentSchedulerTestJSON) ... SKIPPED: l3_agent_scheduler extension not enabled.\n{1} neutron_tempest_plugin.api.admin.test_shared_network_extension.AllowedAddressPairSharedNetworkTest.test_create_with_address_pair_blocked_on_other_network [0.689643s] ... ok\n{3} neutron_tempest_plugin.api.admin.test_ports.PortTestCasesAdmin.test_regenerate_mac_address [1.583062s] ... ok\n{1} neutron_tempest_plugin.api.admin.test_shared_network_extension.AllowedAddressPairSharedNetworkTest.test_update_with_address_pair_blocked_on_other_network [1.413274s] ... 
ok\n{2} neutron_tempest_plugin.api.admin.test_external_network_extension.ExternalNetworksRBACTestJSON.test_delete_policies_while_tenant_attached_to_net [10.148618s] ... ok\n{3} neutron_tempest_plugin.api.admin.test_ports.PortTestCasesAdmin.test_update_mac_address [1.614354s] ... ok\n{2} neutron_tempest_plugin.api.admin.test_external_network_extension.ExternalNetworksRBACTestJSON.test_external_conversion_on_one_policy_delete [4.053436s] ... ok\n{2} neutron_tempest_plugin.api.admin.test_external_network_extension.ExternalNetworksRBACTestJSON.test_external_conversion_on_policy_create [2.500601s] ... ok\n{3} setUpClass (neutron_tempest_plugin.api.admin.test_routers_flavors.RoutersFlavorTestCase) ... SKIPPED: l3-flavors extension not enabled.\n{2} neutron_tempest_plugin.api.admin.test_external_network_extension.ExternalNetworksRBACTestJSON.test_external_conversion_on_policy_delete [2.225582s] ... ok\n{2} neutron_tempest_plugin.api.admin.test_external_network_extension.ExternalNetworksRBACTestJSON.test_external_network_on_shared_policy_delete [2.347570s] ... ok\n{1} neutron_tempest_plugin.api.admin.test_tag.TagFilterQosPolicyTestJSON.test_filter_qos_policy_tags [2.552652s] ... ok\n{2} neutron_tempest_plugin.api.admin.test_external_network_extension.ExternalNetworksRBACTestJSON.test_external_update_policy_from_wildcard_to_specific_tenant [5.921747s] ... ok\n{2} neutron_tempest_plugin.api.admin.test_external_network_extension.ExternalNetworksRBACTestJSON.test_policy_allows_tenant_to_allocate_floatingip [4.590408s] ... ok\n{3} neutron_tempest_plugin.api.admin.test_tag.TagFilterFloatingIpTestJSON.test_filter_floatingip_tags [1.239771s] ... ok\n{2} neutron_tempest_plugin.api.admin.test_external_network_extension.ExternalNetworksRBACTestJSON.test_policy_allows_tenant_to_attach_ext_gw [6.120847s] ... ok\n{1} neutron_tempest_plugin.api.admin.test_tag.TagFilterSecGroupTestJSON.test_filter_security_group_tags [1.103136s] ... 
ok\n{2} neutron_tempest_plugin.api.admin.test_external_network_extension.ExternalNetworksRBACTestJSON.test_regular_client_blocked_from_creating_external_wild_policies [1.276411s] ... ok\n{2} neutron_tempest_plugin.api.admin.test_external_network_extension.ExternalNetworksRBACTestJSON.test_regular_client_shares_with_another [2.852103s] ... ok\n{2} neutron_tempest_plugin.api.admin.test_external_network_extension.ExternalNetworksRBACTestJSON.test_wildcard_policy_created_from_external_network_api [4.921274s] ... ok\n{2} neutron_tempest_plugin.api.admin.test_external_network_extension.ExternalNetworksRBACTestJSON.test_wildcard_policy_delete_blocked_on_default_ext [1.313892s] ... ok\n{1} neutron_tempest_plugin.api.admin.test_tag.TagFilterSubnetpoolTestJSON.test_filter_subnetpool_tags [1.073496s] ... ok\n{3} neutron_tempest_plugin.api.admin.test_tag.TagFilterNetworkTestJSON.test_filter_network_tags [1.689970s] ... ok\n{0} tempest.scenario.test_server_basic_ops.TestServerBasicOps.test_server_basic_ops [51.919366s] ... ok\n{0} setUpClass (neutron_tempest_plugin.api.admin.test_network_segment_range.NetworkSegmentRangeTestJson) ... SKIPPED: network-segment-range extension not enabled.\n{1} neutron_tempest_plugin.api.admin.test_tag.TagPortTestJSON.test_port_tags [2.683534s] ... ok\n{0} neutron_tempest_plugin.api.admin.test_networks.NetworksTestAdmin.test_create_network_with_project [1.897819s] ... ok\n{0} neutron_tempest_plugin.api.admin.test_networks.NetworksTestAdmin.test_create_network_with_project_and_other_tenant [0.124799s] ... ok\n{0} neutron_tempest_plugin.api.admin.test_networks.NetworksTestAdmin.test_create_network_with_project_and_tenant [0.870146s] ... ok\n{0} neutron_tempest_plugin.api.admin.test_networks.NetworksTestAdmin.test_create_tenant_network_vxlan ... 
SKIPPED: VXLAN type_driver is not enabled\n{2} neutron_tempest_plugin.api.admin.test_floating_ips_admin_actions.FloatingIPAdminTestJSON.test_associate_floating_ip_with_port_from_another_project [4.551834s] ... ok\n{2} neutron_tempest_plugin.api.admin.test_floating_ips_admin_actions.FloatingIPAdminTestJSON.test_create_floatingip_with_specified_ip_address [4.462521s] ... ok\n{1} neutron_tempest_plugin.api.admin.test_tag.TagRouterTestJSON.test_router_tags [5.735888s] ... ok\n{3} neutron_tempest_plugin.api.admin.test_tag.TagFilterTrunkTestJSON.test_filter_trunk_tags [1.490591s] ... ok\n{0} neutron_tempest_plugin.api.admin.test_quotas_negative.QuotasAdminNegativeTestJSON.test_create_floatingip_when_quotas_is_full [2.503035s] ... ok\n{0} neutron_tempest_plugin.api.admin.test_quotas_negative.QuotasAdminNegativeTestJSON.test_create_network_when_quotas_is_full [2.959785s] ... ok\n{1} neutron_tempest_plugin.api.admin.test_tag.UpdateTagsTest.test_update_tags_affects_only_updated_resource [4.300004s] ... ok\n{0} neutron_tempest_plugin.api.admin.test_quotas_negative.QuotasAdminNegativeTestJSON.test_create_port_when_quotas_is_full [6.381304s] ... ok\n{0} neutron_tempest_plugin.api.admin.test_quotas_negative.QuotasAdminNegativeTestJSON.test_create_router_when_quotas_is_full [2.302207s] ... ok\n{0} neutron_tempest_plugin.api.admin.test_quotas_negative.QuotasAdminNegativeTestJSON.test_create_security_group_rule_when_quotas_is_full [2.358021s] ... ok\n{2} neutron_tempest_plugin.api.admin.test_quotas.QuotasTest.test_detail_quotas [4.271578s] ... ok\n{2} neutron_tempest_plugin.api.admin.test_quotas.QuotasTest.test_quotas [1.299900s] ... ok\n{0} neutron_tempest_plugin.api.admin.test_quotas_negative.QuotasAdminNegativeTestJSON.test_create_security_group_when_quotas_is_full [1.792787s] ... ok\n{3} neutron_tempest_plugin.api.admin.test_tag.TagQosPolicyTestJSON.test_qos_policy_tags [2.972602s] ... 
ok\n{0} neutron_tempest_plugin.api.admin.test_quotas_negative.QuotasAdminNegativeTestJSON.test_create_subnet_when_quotas_is_full [4.551952s] ... ok\n{2} neutron_tempest_plugin.api.admin.test_security_groups.SecGroupAdminTest.test_security_group_recreated_on_port_update [7.610181s] ... ok\n{3} neutron_tempest_plugin.api.admin.test_tag.TagTrunkTestJSON.test_trunk_tags [4.409673s] ... ok\n{0} setUpClass (neutron_tempest_plugin.api.admin.test_routers_ha.RoutersTestHA) ... SKIPPED: l3-ha extension not enabled.\n{1} neutron_tempest_plugin.api.test_auto_allocated_topology.TestAutoAllocatedTopology.test_delete_allocated_net_topology_as_tenant [23.676270s] ... ok\n{3} neutron_tempest_plugin.api.test_address_scopes_negative.AddressScopeTestNegative.test_delete_address_scope_associated_with_subnetpool [2.185197s] ... ok\n{3} neutron_tempest_plugin.api.test_address_scopes_negative.AddressScopeTestNegative.test_delete_non_existent_address_scope [0.102985s] ... ok\n{3} neutron_tempest_plugin.api.test_address_scopes_negative.AddressScopeTestNegative.test_get_non_existent_address_scope [0.102643s] ... ok\n{3} neutron_tempest_plugin.api.test_address_scopes_negative.AddressScopeTestNegative.test_tenant_create_shared_address_scope [0.078539s] ... ok\n{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.RBACSharedNetworksTest.test_create_rbac_policy_with_target_tenant_none [5.283354s] ... ok\n{3} neutron_tempest_plugin.api.test_address_scopes_negative.AddressScopeTestNegative.test_tenant_get_not_shared_admin_address_scope [0.542851s] ... ok\n{3} neutron_tempest_plugin.api.test_address_scopes_negative.AddressScopeTestNegative.test_tenant_update_address_scope_shared_false [0.238029s] ... ok\n{3} neutron_tempest_plugin.api.test_address_scopes_negative.AddressScopeTestNegative.test_tenant_update_address_scope_shared_true [0.268173s] ... 
ok\n{3} neutron_tempest_plugin.api.test_address_scopes_negative.AddressScopeTestNegative.test_update_non_existent_address_scope [0.108735s] ... ok\n{2} neutron_tempest_plugin.api.admin.test_tag.TagSecGroupTestJSON.test_security_group_tags [2.561537s] ... ok\n{3} neutron_tempest_plugin.api.test_address_scopes_negative.AddressScopeTestNegative.test_update_shared_address_scope_to_unshare [0.239790s] ... ok\n{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.RBACSharedNetworksTest.test_create_rbac_policy_with_target_tenant_too_long_id [4.151625s] ... ok\n{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.RBACSharedNetworksTest.test_delete_self_share_rule [2.529939s] ... ok\n{1} neutron_tempest_plugin.api.test_auto_allocated_topology.TestAutoAllocatedTopology.test_get_allocated_net_topology_as_tenant [13.724974s] ... ok\n{3} neutron_tempest_plugin.api.test_flavors_extensions.TestFlavorsIpV6TestJSON.test_create_update_delete_flavor [0.594934s] ... ok\n{2} neutron_tempest_plugin.api.admin.test_tag.TagSubnetPoolTestJSON.test_subnetpool_tags [3.147804s] ... ok\n{3} neutron_tempest_plugin.api.test_flavors_extensions.TestFlavorsIpV6TestJSON.test_create_update_delete_service_profile [0.810369s] ... ok\n{3} neutron_tempest_plugin.api.test_flavors_extensions.TestFlavorsIpV6TestJSON.test_list_flavors [0.328330s] ... ok\n{3} neutron_tempest_plugin.api.test_flavors_extensions.TestFlavorsIpV6TestJSON.test_list_service_profiles [0.060740s] ... ok\n{3} neutron_tempest_plugin.api.test_flavors_extensions.TestFlavorsIpV6TestJSON.test_show_flavor [0.168775s] ... ok\n{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.RBACSharedNetworksTest.test_duplicate_policy_error [5.171543s] ... ok\n{3} neutron_tempest_plugin.api.test_flavors_extensions.TestFlavorsIpV6TestJSON.test_show_service_profile [0.104093s] ... ok\n{2} setUpClass (neutron_tempest_plugin.api.test_address_groups.RbacSharedAddressGroupTest) ... 
SKIPPED: rbac-address-group extension not enabled.\n{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.RBACSharedNetworksTest.test_filter_fields [2.912524s] ... ok\n{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.RBACSharedNetworksTest.test_filter_policies [1.940854s] ... ok\n{1} setUpClass (neutron_tempest_plugin.api.test_conntrack_helper.ConntrackHelperTestJSON) ... SKIPPED: l3-conntrack-helper extension not enabled.\n{2} neutron_tempest_plugin.api.test_address_scopes.AddressScopeTest.test_admin_create_shared_address_scope [1.811667s] ... ok\n{2} neutron_tempest_plugin.api.test_address_scopes.AddressScopeTest.test_admin_update_shared_address_scope [0.528687s] ... ok\n{2} neutron_tempest_plugin.api.test_address_scopes.AddressScopeTest.test_delete_address_scope [0.710185s] ... ok\n{2} neutron_tempest_plugin.api.test_address_scopes.AddressScopeTest.test_show_address_scope [0.458182s] ... ok\n{2} neutron_tempest_plugin.api.test_address_scopes.AddressScopeTest.test_show_address_scope_project_id [0.210792s] ... ok\n{2} neutron_tempest_plugin.api.test_address_scopes.AddressScopeTest.test_tenant_create_list_address_scope [0.424569s] ... ok\n{2} neutron_tempest_plugin.api.test_address_scopes.AddressScopeTest.test_tenant_update_address_scope [0.593759s] ... ok\n{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.RBACSharedNetworksTest.test_filtering_works_with_rbac_records_present [5.826728s] ... ok\n{2} setUpClass (neutron_tempest_plugin.api.test_metering_extensions.MeteringIpV6TestJSON) ... SKIPPED: metering extension not enabled.\n{2} setUpClass (neutron_tempest_plugin.api.test_metering_negative.MeteringNegativeTestJSON) ... SKIPPED: metering extension not enabled.\n{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.RBACSharedNetworksTest.test_network_only_visible_to_policy_target [3.931753s] ... 
ok\n{1} neutron_tempest_plugin.api.test_floating_ips.FloatingIPPoolTestJSON.test_create_floatingip_from_specific_pool [7.174111s] ... ok\n{3} neutron_tempest_plugin.api.test_floating_ips.FloatingIPTestJSON.test_blank_update_clears_association [2.820457s] ... ok\n{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.RBACSharedNetworksTest.test_policy_show [4.711084s] ... ok\n{3} neutron_tempest_plugin.api.test_floating_ips.FloatingIPTestJSON.test_create_update_floatingip_description [4.832600s] ... ok\n{2} neutron_tempest_plugin.api.test_network_ip_availability.NetworksIpAvailabilityIPv4Test.test_list_ip_availability_after_port_delete [5.632098s] ... ok\n{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.RBACSharedNetworksTest.test_policy_target_update [4.468503s] ... ok\n{2} neutron_tempest_plugin.api.test_network_ip_availability.NetworksIpAvailabilityIPv4Test.test_list_ip_availability_after_subnet_and_ports [3.751487s] ... ok\n{3} neutron_tempest_plugin.api.test_floating_ips.FloatingIPTestJSON.test_create_update_floatingip_port_details [4.172069s] ... ok\n{2} neutron_tempest_plugin.api.test_network_ip_availability.NetworksIpAvailabilityIPv4Test.test_list_ip_availability_before_subnet [0.889783s] ... ok\n{1} neutron_tempest_plugin.api.test_network_ip_availability.NetworksIpAvailabilityIPv6Test.test_list_ip_availability_after_port_delete [4.330884s] ... ok\n{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.RBACSharedNetworksTest.test_port_presence_prevents_network_rbac_policy_deletion [6.608564s] ... ok\n{3} neutron_tempest_plugin.api.test_floating_ips.FloatingIPTestJSON.test_floatingip_update_extra_attributes_port_id_not_changed [3.460644s] ... ok\n{2} neutron_tempest_plugin.api.test_network_ip_availability.NetworksIpAvailabilityIPv4Test.test_show_ip_availability_after_port_delete [3.169123s] ... 
ok\n{1} neutron_tempest_plugin.api.test_network_ip_availability.NetworksIpAvailabilityIPv6Test.test_list_ip_availability_after_subnet_and_ports [2.015763s] ... ok\n{1} neutron_tempest_plugin.api.test_network_ip_availability.NetworksIpAvailabilityIPv6Test.test_list_ip_availability_before_subnet [0.948350s] ... ok\n{2} neutron_tempest_plugin.api.test_network_ip_availability.NetworksIpAvailabilityIPv4Test.test_show_ip_availability_after_subnet_and_ports_create [2.543879s] ... ok\n{1} neutron_tempest_plugin.api.test_network_ip_availability.NetworksIpAvailabilityIPv6Test.test_list_ipv6_ip_availability_after_subnet_and_ports [2.114246s] ... ok\n{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.RBACSharedNetworksTest.test_rbac_bumps_network_revision [5.175137s] ... ok\n{1} neutron_tempest_plugin.api.test_network_ip_availability.NetworksIpAvailabilityIPv6Test.test_show_ip_availability_after_port_delete [2.872928s] ... ok\n{1} neutron_tempest_plugin.api.test_network_ip_availability.NetworksIpAvailabilityIPv6Test.test_show_ip_availability_after_subnet_and_ports_create [2.154625s] ... ok\n{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.RBACSharedNetworksTest.test_rbac_policy_quota [5.613359s] ... ok\n{3} setUpClass (neutron_tempest_plugin.api.test_local_ip.LocalIPAssociationTestJSON) ... SKIPPED: local_ip extension not enabled.\n{3} setUpClass (neutron_tempest_plugin.api.test_local_ip.LocalIPTestJSON) ... SKIPPED: local_ip extension not enabled.\n{3} setUpClass (neutron_tempest_plugin.api.test_ndp_proxy.NDPProxyTestJSON) ... SKIPPED: l3-ndp-proxy extension not enabled.\n{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.RBACSharedNetworksTest.test_regular_client_blocked_from_sharing_anothers_network [4.413670s] ... ok\n{2} neutron_tempest_plugin.api.test_ports_negative.PortsNegativeTestJSON.test_add_port_with_illegal_ip [0.105194s] ... 
ok\n{2} neutron_tempest_plugin.api.test_ports_negative.PortsNegativeTestJSON.test_add_port_with_nonexist_network_id [0.114777s] ... ok\n{2} neutron_tempest_plugin.api.test_ports_negative.PortsNegativeTestJSON.test_add_port_with_nonexist_security_groups_id [0.153464s] ... ok\n{2} neutron_tempest_plugin.api.test_ports_negative.PortsNegativeTestJSON.test_add_port_with_nonexist_tenant_id [0.086532s] ... ok\n{2} neutron_tempest_plugin.api.test_ports_negative.PortsNegativeTestJSON.test_add_port_with_too_long_description [0.093438s] ... ok\n{2} neutron_tempest_plugin.api.test_ports_negative.PortsNegativeTestJSON.test_add_port_with_too_long_name [0.076172s] ... ok\n{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.RBACSharedNetworksTest.test_regular_client_blocked_from_sharing_with_wildcard [2.259728s] ... ok\n{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.RBACSharedNetworksTest.test_regular_client_shares_to_another_regular_client [1.647954s] ... ok\n{3} neutron_tempest_plugin.api.test_networks.NetworksSearchCriteriaTest.test_list_no_pagination_limit_0 [0.172839s] ... ok\n{3} neutron_tempest_plugin.api.test_networks.NetworksSearchCriteriaTest.test_list_pagination [0.898385s] ... ok\n{3} neutron_tempest_plugin.api.test_networks.NetworksSearchCriteriaTest.test_list_pagination_page_reverse_asc [0.399728s] ... ok\n{3} neutron_tempest_plugin.api.test_networks.NetworksSearchCriteriaTest.test_list_pagination_page_reverse_desc [0.514045s] ... ok\n{3} neutron_tempest_plugin.api.test_networks.NetworksSearchCriteriaTest.test_list_pagination_page_reverse_with_href_links [1.350247s] ... ok\n{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.RBACSharedNetworksTest.test_subnet_on_network_only_visible_to_policy_target [3.792578s] ... ok\n{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_create_policy [1.022506s] ... 
ok\n{1} neutron_tempest_plugin.api.test_qos.QosBandwidthLimitRuleTestJSON.test_attach_and_detach_a_policy_by_a_tenant ... SKIPPED: Creation of shared resources should be allowed,\n setting the create_shared_resources option as 'True' is needed\n{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_default_policy_creating_network_with_policy [3.696015s] ... ok\n{3} neutron_tempest_plugin.api.test_networks.NetworksSearchCriteriaTest.test_list_pagination_with_href_links [4.279758s] ... ok\n{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.RBACSharedNetworksTest.test_tenant_can_delete_port_on_own_network [5.077489s] ... ok\n{1} neutron_tempest_plugin.api.test_qos.QosBandwidthLimitRuleTestJSON.test_get_rules_by_policy [4.757349s] ... ok\n{3} neutron_tempest_plugin.api.test_networks.NetworksSearchCriteriaTest.test_list_pagination_with_marker [1.953801s] ... ok\n{3} neutron_tempest_plugin.api.test_networks.NetworksSearchCriteriaTest.test_list_sorts_asc [0.158171s] ... ok\n{3} neutron_tempest_plugin.api.test_networks.NetworksSearchCriteriaTest.test_list_sorts_desc [0.852153s] ... ok\n{1} neutron_tempest_plugin.api.test_qos.QosBandwidthLimitRuleTestJSON.test_rule_create [1.846129s] ... ok\n{3} neutron_tempest_plugin.api.test_networks.NetworksSearchCriteriaTest.test_list_validation_filters [0.210388s] ... ok\n{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.RBACSharedNetworksTest.test_tenant_cant_delete_other_tenants_ports [2.457686s] ... ok\n{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_default_policy_creating_network_without_policy [3.838242s] ... ok\n{1} neutron_tempest_plugin.api.test_qos.QosBandwidthLimitRuleTestJSON.test_rule_create_fail_for_the_same_type [0.925686s] ... ok\n{1} neutron_tempest_plugin.api.test_qos.QosBandwidthLimitRuleTestJSON.test_rule_create_forbidden_for_regular_tenants [0.157129s] ... 
ok\n{1} neutron_tempest_plugin.api.test_qos.QosBandwidthLimitRuleTestJSON.test_rule_create_rule_nonexistent_policy [0.130695s] ... ok\n{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_delete_not_allowed_if_policy_in_use_by_network [0.977860s] ... ok\n{1} neutron_tempest_plugin.api.test_qos.QosBandwidthLimitRuleTestJSON.test_rule_delete [1.516228s] ... ok\n{1} neutron_tempest_plugin.api.test_qos.QosBandwidthLimitRuleTestJSON.test_rule_update [1.169838s] ... ok\n{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.RBACSharedNetworksTest.test_update_self_share_rule [4.114812s] ... ok\n{1} neutron_tempest_plugin.api.test_qos.QosBandwidthLimitRuleTestJSON.test_rule_update_forbidden_for_regular_tenants_foreign_policy [0.783567s] ... ok\n{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_delete_not_allowed_if_policy_in_use_by_port [3.269851s] ... ok\n{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_delete_policy [0.487114s] ... ok\n{1} neutron_tempest_plugin.api.test_qos.QosBandwidthLimitRuleTestJSON.test_rule_update_forbidden_for_regular_tenants_own_policy [0.760475s] ... ok\n{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_get_policy_that_is_shared [0.294097s] ... ok\n{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_list_admin_rule_types [0.080477s] ... ok\n{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_list_policy_filter_by_name [1.044553s] ... ok\n{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_list_policy_sort_by_name [1.125442s] ... ok\n{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_list_regular_rule_types [0.113008s] ... ok\n{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_policy_association_with_admin_network [2.489080s] ... ok\n{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_policy_association_with_network_non_shared_policy [0.413875s] ... ok\n{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_policy_association_with_network_nonexistent_policy [0.316380s] ... 
ok
{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_policy_association_with_port_non_shared_policy [1.604576s] ... ok
{3} neutron_tempest_plugin.api.test_networks_negative.NetworksNegativeTest.test_delete_network_in_use [1.526792s] ... ok
{3} neutron_tempest_plugin.api.test_networks_negative.NetworksNegativeTest.test_update_network_mtu [0.359368s] ... ok
{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_policy_association_with_port_nonexistent_policy [0.888316s] ... ok
{1} neutron_tempest_plugin.api.test_qos.QosBandwidthLimitRuleWithDirectionTestJSON.test_attach_and_detach_a_policy_by_a_tenant ... SKIPPED: Creation of shared resources should be allowed,
 setting the create_shared_resources option as 'True' is needed
{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_policy_association_with_port_shared_policy [2.016138s] ... ok
{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_policy_association_with_tenant_network [2.065159s] ... ok
{1} neutron_tempest_plugin.api.test_qos.QosBandwidthLimitRuleWithDirectionTestJSON.test_create_policy_with_multiple_rules [2.728317s] ... ok
{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_policy_create_forbidden_for_regular_tenants [0.118396s] ... ok
{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_policy_update [0.376464s] ... ok
{1} neutron_tempest_plugin.api.test_qos.QosBandwidthLimitRuleWithDirectionTestJSON.test_get_rules_by_policy [1.290158s] ... ok
{1} neutron_tempest_plugin.api.test_qos.QosBandwidthLimitRuleWithDirectionTestJSON.test_rule_create [1.003631s] ... ok
{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_policy_update_association_with_admin_network [1.988622s] ... ok
{1} neutron_tempest_plugin.api.test_qos.QosBandwidthLimitRuleWithDirectionTestJSON.test_rule_create_fail_for_the_same_type [0.622834s] ... ok
{1} neutron_tempest_plugin.api.test_qos.QosBandwidthLimitRuleWithDirectionTestJSON.test_rule_create_forbidden_for_regular_tenants [0.124058s] ... ok
{1} neutron_tempest_plugin.api.test_qos.QosBandwidthLimitRuleWithDirectionTestJSON.test_rule_create_rule_nonexistent_policy [0.138091s] ... ok
{1} neutron_tempest_plugin.api.test_qos.QosBandwidthLimitRuleWithDirectionTestJSON.test_rule_delete [1.119572s] ... ok
{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_policy_update_association_with_port_shared_policy [2.277442s] ... ok
{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_policy_update_forbidden_for_regular_tenants_foreign_policy [0.235575s] ... ok
{1} neutron_tempest_plugin.api.test_qos.QosBandwidthLimitRuleWithDirectionTestJSON.test_rule_update [0.917897s] ... ok
{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_policy_update_forbidden_for_regular_tenants_own_policy [0.472488s] ... ok
{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.SharedNetworksTest.test_create_bulk_shared_network [2.547282s] ... ok
{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_qos_policy_delete_with_rules [0.570282s] ... ok
{1} neutron_tempest_plugin.api.test_qos.QosBandwidthLimitRuleWithDirectionTestJSON.test_rule_update_1 [0.812254s] ... ok
{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_shared_policy_update [0.714555s] ... ok
{1} neutron_tempest_plugin.api.test_qos.QosBandwidthLimitRuleWithDirectionTestJSON.test_rule_update_2 [1.046047s] ... ok
{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_show_policy_has_project_id [0.526008s] ... ok
{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_show_rule_type_details_as_admin [0.097554s] ... ok
{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_show_rule_type_details_as_user [0.100763s] ... ok
{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.SharedNetworksTest.test_create_port_shared_network_as_non_admin_tenant [2.068579s] ... ok
{1} neutron_tempest_plugin.api.test_qos.QosBandwidthLimitRuleWithDirectionTestJSON.test_rule_update_forbidden_for_regular_tenants_foreign_policy [0.670657s] ... ok
{1} neutron_tempest_plugin.api.test_qos.QosBandwidthLimitRuleWithDirectionTestJSON.test_rule_update_forbidden_for_regular_tenants_own_policy [0.549154s] ... ok
{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.SharedNetworksTest.test_create_update_shared_network [1.411960s] ... ok
{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_user_create_port_with_admin_qos_policy [1.912578s] ... ok
{3} neutron_tempest_plugin.api.test_port_forwarding_negative.PortForwardingNegativeTestJSON.test_mapping_different_external_ports_to_the_same_destination [2.782070s] ... ok
{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.SharedNetworksTest.test_filtering_shared_networks [1.540699s] ... ok
{3} neutron_tempest_plugin.api.test_port_forwarding_negative.PortForwardingNegativeTestJSON.test_mapping_same_fip_and_external_port_to_different_dest [2.736586s] ... ok
{3} neutron_tempest_plugin.api.test_port_forwarding_negative.PortForwardingNegativeTestJSON.test_out_of_range_ports [2.756117s] ... ok
{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.SharedNetworksTest.test_filtering_shared_subnets [5.189565s] ... ok
{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.SharedNetworksTest.test_list_shared_networks [0.956732s] ... ok
{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.SharedNetworksTest.test_show_shared_networks_attribute [0.333569s] ... ok
{1} neutron_tempest_plugin.api.test_qos.QosSearchCriteriaTest.test_list_no_pagination_limit_0 [0.662606s] ... ok
{1} neutron_tempest_plugin.api.test_qos.QosSearchCriteriaTest.test_list_pagination [1.112544s] ... ok
{1} neutron_tempest_plugin.api.test_qos.QosSearchCriteriaTest.test_list_pagination_page_reverse_asc [0.282426s] ... ok
{1} neutron_tempest_plugin.api.test_qos.QosSearchCriteriaTest.test_list_pagination_page_reverse_desc [0.330405s] ... ok
{1} neutron_tempest_plugin.api.test_qos.QosSearchCriteriaTest.test_list_pagination_page_reverse_with_href_links [1.451963s] ... ok
{1} neutron_tempest_plugin.api.test_qos.QosSearchCriteriaTest.test_list_pagination_with_href_links [2.307660s] ... ok
{1} neutron_tempest_plugin.api.test_qos.QosSearchCriteriaTest.test_list_pagination_with_marker [1.356914s] ... ok
{1} neutron_tempest_plugin.api.test_qos.QosSearchCriteriaTest.test_list_sorts_asc [0.520452s] ... ok
{1} neutron_tempest_plugin.api.test_qos.QosSearchCriteriaTest.test_list_sorts_desc [1.604479s] ... ok
{2} neutron_tempest_plugin.api.test_qos.RbacSharedQosPoliciesTest.test_filter_fields [1.926834s] ... ok
{1} neutron_tempest_plugin.api.test_qos_negative.QosMinimumPpsRuleNegativeTestJSON.test_rule_create_rule_non_existent_policy [0.461581s] ... ok
{1} neutron_tempest_plugin.api.test_qos_negative.QosMinimumPpsRuleNegativeTestJSON.test_rule_update_rule_nonexistent_policy [0.858768s] ... ok
{1} neutron_tempest_plugin.api.test_qos_negative.QosMinimumPpsRuleNegativeTestJSON.test_rule_update_rule_nonexistent_rule [0.245830s] ... ok
{2} neutron_tempest_plugin.api.test_qos.RbacSharedQosPoliciesTest.test_filter_rbac_policies [2.490463s] ... ok
{0} neutron_tempest_plugin.api.admin.test_tag.TagFilterPortTestJSON.test_filter_port_tags [5.811020s] ... ok
{2} neutron_tempest_plugin.api.test_qos.RbacSharedQosPoliciesTest.test_net_bound_shared_policy_wildcard_and_project_id_wild_remove [7.546834s] ... ok
{2} neutron_tempest_plugin.api.test_qos.RbacSharedQosPoliciesTest.test_net_bound_shared_policy_wildcard_and_projectid_wild_remains [4.456532s] ... ok
{1} neutron_tempest_plugin.api.test_qos_negative.QosNegativeTestJSON.test_add_policy_with_too_long_description [0.448426s] ... ok
{1} neutron_tempest_plugin.api.test_qos_negative.QosNegativeTestJSON.test_add_policy_with_too_long_name [0.080204s] ... ok
{1} neutron_tempest_plugin.api.test_qos_negative.QosNegativeTestJSON.test_add_policy_with_too_long_tenant_id [0.106243s] ... ok
{3} neutron_tempest_plugin.api.test_port_forwardings.PortForwardingTestJSON.test_associate_2_port_forwardings_to_floating_ip [7.367977s] ... ok
{1} neutron_tempest_plugin.api.test_qos_negative.QosNegativeTestJSON.test_delete_non_existent_qos_policy [0.967944s] ... ok
{1} neutron_tempest_plugin.api.test_qos_negative.QosNegativeTestJSON.test_get_non_existent_qos_policy [0.183500s] ... ok
{1} neutron_tempest_plugin.api.test_qos_negative.QosNegativeTestJSON.test_update_non_existent_qos_policy [0.112371s] ... ok
{1} neutron_tempest_plugin.api.test_qos_negative.QosNegativeTestJSON.test_update_policy_with_too_long_description [0.271208s] ... ok
{1} neutron_tempest_plugin.api.test_qos_negative.QosNegativeTestJSON.test_update_policy_with_too_long_name [0.209588s] ... ok
{2} neutron_tempest_plugin.api.test_qos.RbacSharedQosPoliciesTest.test_network_presence_prevents_policy_rbac_policy_deletion [4.603639s] ... ok
{1} setUpClass (neutron_tempest_plugin.api.test_routers.DvrRoutersTestToCentralized) ... SKIPPED: dvr extension not enabled.
{1} setUpClass (neutron_tempest_plugin.api.test_routers.HaRoutersTest) ... SKIPPED: l3-ha extension not enabled.
{3} neutron_tempest_plugin.api.test_port_forwardings.PortForwardingTestJSON.test_associate_port_forwarding_to_2_fixed_ips [4.991159s] ... ok
{2} neutron_tempest_plugin.api.test_qos.RbacSharedQosPoliciesTest.test_policy_sharing_with_wildcard [3.400072s] ... ok
{3} neutron_tempest_plugin.api.test_port_forwardings.PortForwardingTestJSON.test_associate_port_forwarding_to_port_with_fip [2.914703s] ... ok
{2} neutron_tempest_plugin.api.test_qos.RbacSharedQosPoliciesTest.test_policy_sharing_with_wildcard_and_project_id [2.470462s] ... ok
{2} neutron_tempest_plugin.api.test_qos.RbacSharedQosPoliciesTest.test_policy_target_update [0.851857s] ... ok
{2} neutron_tempest_plugin.api.test_qos.RbacSharedQosPoliciesTest.test_rbac_policy_show [1.197836s] ... ok
{3} neutron_tempest_plugin.api.test_port_forwardings.PortForwardingTestJSON.test_associate_port_forwarding_to_used_floating_ip [3.814490s] ... ok
{0} neutron_tempest_plugin.api.admin.test_tag.TagFilterRouterTestJSON.test_filter_router_tags [1.939613s] ... ok
{2} neutron_tempest_plugin.api.test_qos.RbacSharedQosPoliciesTest.test_regular_client_blocked_from_sharing_anothers_policy [1.080434s] ... ok
{1} neutron_tempest_plugin.api.test_routers.RoutersIpV6Test.test_create_router_with_default_snat_value [5.072216s] ... ok
{2} neutron_tempest_plugin.api.test_qos.RbacSharedQosPoliciesTest.test_regular_client_shares_to_another_regular_client [1.126698s] ... ok
{3} neutron_tempest_plugin.api.test_port_forwardings.PortForwardingTestJSON.test_port_forwarding_info_in_fip_details [7.066597s] ... ok
{3} neutron_tempest_plugin.api.test_port_forwardings.PortForwardingTestJSON.test_port_forwarding_life_cycle [5.191294s] ... ok
{2} neutron_tempest_plugin.api.test_qos_negative.QosBandwidthLimitRuleNegativeTestJSON.test_rule_create_rule_non_existent_policy [0.525829s] ... ok
{1} neutron_tempest_plugin.api.test_routers.RoutersIpV6Test.test_create_router_with_snat_explicit [11.750769s] ... ok
{2} neutron_tempest_plugin.api.test_qos_negative.QosBandwidthLimitRuleNegativeTestJSON.test_rule_update_rule_nonexistent_policy [0.461035s] ... ok
{2} neutron_tempest_plugin.api.test_qos_negative.QosBandwidthLimitRuleNegativeTestJSON.test_rule_update_rule_nonexistent_rule [0.467391s] ... ok
{1} neutron_tempest_plugin.api.test_routers.RoutersIpV6Test.test_create_update_router_description [1.761838s] ... ok
{1} neutron_tempest_plugin.api.test_routers.RoutersIpV6Test.test_extra_routes_atomic ... SKIPPED: Skipped because network extension: extraroute-atomic is not enabled
{2} neutron_tempest_plugin.api.test_revisions.TestRevisions.test_portbinding_bumps_revision [6.601580s] ... ok
{2} neutron_tempest_plugin.api.test_revisions.TestRevisions.test_update_allowed_address_pairs_bumps_revision [3.984981s] ... ok
{1} neutron_tempest_plugin.api.test_routers.RoutersIpV6Test.test_network_attached_with_two_routers [18.022642s] ... ok
{2} neutron_tempest_plugin.api.test_revisions.TestRevisions.test_update_dns_domain_bumps_revision [4.356534s] ... ok
{0} neutron_tempest_plugin.api.admin.test_tag.TagFilterSubnetTestJSON.test_filter_subnet_tags [1.601336s] ... ok
{1} neutron_tempest_plugin.api.test_routers.RoutersIpV6Test.test_router_interface_status [5.908452s] ... ok
{2} neutron_tempest_plugin.api.test_revisions.TestRevisions.test_update_external_network_bumps_revision [2.003038s] ... ok
{3} neutron_tempest_plugin.api.test_ports.PortsSearchCriteriaTest.test_list_no_pagination_limit_0 [0.169012s] ... ok
{3} neutron_tempest_plugin.api.test_ports.PortsSearchCriteriaTest.test_list_pagination [1.060484s] ... ok
{2} neutron_tempest_plugin.api.test_revisions.TestRevisions.test_update_extra_dhcp_opt_bumps_revision [4.675054s] ... ok
{3} neutron_tempest_plugin.api.test_ports.PortsSearchCriteriaTest.test_list_pagination_page_reverse_asc [0.440410s] ... ok
{3} neutron_tempest_plugin.api.test_ports.PortsSearchCriteriaTest.test_list_pagination_page_reverse_desc [0.653336s] ... ok
{3} neutron_tempest_plugin.api.test_ports.PortsSearchCriteriaTest.test_list_pagination_page_reverse_with_href_links [0.628723s] ... ok
{3} neutron_tempest_plugin.api.test_ports.PortsSearchCriteriaTest.test_list_pagination_with_href_links [2.329151s] ... ok
{1} neutron_tempest_plugin.api.test_routers.RoutersIpV6Test.test_router_interface_update_and_remove_gateway_ip [9.087182s] ... ok
{3} neutron_tempest_plugin.api.test_ports.PortsSearchCriteriaTest.test_list_pagination_with_marker [1.725382s] ... ok
{3} neutron_tempest_plugin.api.test_ports.PortsSearchCriteriaTest.test_list_sorts_asc [0.223099s] ... ok
{3} neutron_tempest_plugin.api.test_ports.PortsSearchCriteriaTest.test_list_sorts_desc [0.202794s] ... ok
{0} neutron_tempest_plugin.api.admin.test_tag.TagFloatingIpTestJSON.test_floatingip_tags [2.750031s] ... ok
{2} neutron_tempest_plugin.api.test_revisions.TestRevisions.test_update_floatingip_bumps_revision [13.259553s] ... ok
{1} neutron_tempest_plugin.api.test_routers.RoutersIpV6Test.test_update_extra_route [9.014233s] ... ok
{2} neutron_tempest_plugin.api.test_revisions.TestRevisions.test_update_network_bumps_revision [2.280040s] ... ok
{2} neutron_tempest_plugin.api.test_revisions.TestRevisions.test_update_network_constrained_by_revision [2.051329s] ... ok
{3} neutron_tempest_plugin.api.test_ports.PortsTestJSON.test_add_ips_to_port [3.621467s] ... ok
{2} neutron_tempest_plugin.api.test_revisions.TestRevisions.test_update_port_bumps_revision [3.230490s] ... ok
{1} neutron_tempest_plugin.api.test_routers.RoutersIpV6Test.test_update_router_reset_gateway_without_snat [7.723832s] ... ok
{0} neutron_tempest_plugin.api.admin.test_tag.TagNetworkTestJSON.test_network_tags [2.935376s] ... ok
{3} neutron_tempest_plugin.api.test_ports.PortsTestJSON.test_change_dhcp_flag_then_create_port [4.251096s] ... ok
{3} neutron_tempest_plugin.api.test_ports.PortsTestJSON.test_create_update_port_description [2.085041s] ... ok
{2} neutron_tempest_plugin.api.test_revisions.TestRevisions.test_update_port_security_bumps_revisions [5.718283s] ... ok
{3} neutron_tempest_plugin.api.test_ports.PortsTestJSON.test_create_update_port_security [1.663191s] ... ok
{1} neutron_tempest_plugin.api.test_routers.RoutersIpV6Test.test_update_router_set_gateway_with_snat_explicit [6.671504s] ... ok
{2} neutron_tempest_plugin.api.test_revisions.TestRevisions.test_update_port_sg_binding_bumps_revision [6.121588s] ... ok
{3} neutron_tempest_plugin.api.test_ports.PortsTestJSON.test_create_update_port_with_dns_domain [6.447065s] ... ok
{2} neutron_tempest_plugin.api.test_revisions.TestRevisions.test_update_qos_network_policy_binding_bumps_revision [3.190510s] ... ok
{3} neutron_tempest_plugin.api.test_ports.PortsTestJSON.test_create_update_port_with_dns_name [2.815781s] ... ok
{1} neutron_tempest_plugin.api.test_routers.RoutersIpV6Test.test_update_router_set_gateway_without_snat [9.597167s] ... ok
{3} neutron_tempest_plugin.api.test_ports.PortsTestJSON.test_create_update_port_with_no_dns_name [2.326521s] ... ok
{0} neutron_tempest_plugin.api.admin.test_tag.TagSubnetTestJSON.test_subnet_tags [2.973892s] ... ok
{2} neutron_tempest_plugin.api.test_revisions.TestRevisions.test_update_qos_port_policy_binding_bumps_revision [3.429237s] ... ok
{3} neutron_tempest_plugin.api.test_ports.PortsTestJSON.test_port_shut_down ... SKIPPED: At least one DHCP agent is required to be running in the environment for this test.
{2} neutron_tempest_plugin.api.test_revisions.TestRevisions.test_update_router_bumps_revision [8.760727s] ... ok
{2} neutron_tempest_plugin.api.test_revisions.TestRevisions.test_update_router_extra_attributes_bumps_revision ... SKIPPED: Skipped because network extension: l3-ha is not enabled
{0} neutron_tempest_plugin.api.test_address_groups.AddressGroupTest.test_add_wrong_address_to_address_group [2.402755s] ... ok
{0} neutron_tempest_plugin.api.test_address_groups.AddressGroupTest.test_address_group_create_with_wrong_address [0.219410s] ... ok
{0} neutron_tempest_plugin.api.test_address_groups.AddressGroupTest.test_address_group_lifecycle [0.778784s] ... ok
{0} neutron_tempest_plugin.api.test_address_groups.AddressGroupTest.test_edit_addresses_in_address_group [0.876919s] ... ok
{0} neutron_tempest_plugin.api.test_address_groups.AddressGroupTest.test_remove_wrong_address_from_address_group [1.616347s] ... ok
{1} neutron_tempest_plugin.api.test_routers.RoutersTest.test_create_router_with_default_snat_value [6.503243s] ... ok
{2} neutron_tempest_plugin.api.test_revisions.TestRevisions.test_update_router_extra_routes_bumps_revision [9.675252s] ... ok
{3} neutron_tempest_plugin.api.test_qos.QosMinimumBandwidthRuleTestJSON.test_get_rules_by_policy [1.837383s] ... ok
{2} neutron_tempest_plugin.api.test_revisions.TestRevisions.test_update_sg_group_bumps_revision [0.736822s] ... ok
{3} neutron_tempest_plugin.api.test_qos.QosMinimumBandwidthRuleTestJSON.test_rule_create [1.552827s] ... ok
{3} neutron_tempest_plugin.api.test_qos.QosMinimumBandwidthRuleTestJSON.test_rule_create_fail_for_missing_min_kbps [0.130022s] ... ok
{2} neutron_tempest_plugin.api.test_revisions.TestRevisions.test_update_sg_rule_bumps_sg_revision [1.916058s] ... ok
{3} neutron_tempest_plugin.api.test_qos.QosMinimumBandwidthRuleTestJSON.test_rule_create_fail_for_the_same_type [0.719677s] ... ok
{3} neutron_tempest_plugin.api.test_qos.QosMinimumBandwidthRuleTestJSON.test_rule_create_forbidden_for_regular_tenants [0.271059s] ... ok
{0} neutron_tempest_plugin.api.test_address_scopes.RbacAddressScopeTest.test_filter_fields [1.914653s] ... ok
{3} neutron_tempest_plugin.api.test_qos.QosMinimumBandwidthRuleTestJSON.test_rule_create_pass_for_direction_ingress [0.502831s] ... ok
{3} neutron_tempest_plugin.api.test_qos.QosMinimumBandwidthRuleTestJSON.test_rule_create_rule_nonexistent_policy [0.101398s] ... ok
{3} neutron_tempest_plugin.api.test_qos.QosMinimumBandwidthRuleTestJSON.test_rule_delete [0.963674s] ... ok
{0} neutron_tempest_plugin.api.test_address_scopes.RbacAddressScopeTest.test_filter_rbac_policies [1.119058s] ... ok
{0} neutron_tempest_plugin.api.test_address_scopes.RbacAddressScopeTest.test_policy_target_update [0.538627s] ... ok
{3} neutron_tempest_plugin.api.test_qos.QosMinimumBandwidthRuleTestJSON.test_rule_update [1.255874s] ... ok
{2} neutron_tempest_plugin.api.test_revisions.TestRevisions.test_update_subnet_bumps_network_revision [3.560233s] ... ok
{0} neutron_tempest_plugin.api.test_address_scopes.RbacAddressScopeTest.test_rbac_policy_show [2.044851s] ... ok
{0} neutron_tempest_plugin.api.test_address_scopes.RbacAddressScopeTest.test_regular_client_blocked_from_sharing_anothers_policy [0.523255s] ... ok
{0} neutron_tempest_plugin.api.test_address_scopes.RbacAddressScopeTest.test_regular_client_shares_to_another_regular_client [1.571893s] ... ok
{1} neutron_tempest_plugin.api.test_routers.RoutersTest.test_create_router_with_snat_explicit [10.975996s] ... ok
{2} neutron_tempest_plugin.api.test_revisions.TestRevisions.test_update_subnet_bumps_revision [4.212800s] ... ok
{0} neutron_tempest_plugin.api.test_address_scopes.RbacAddressScopeTest.test_subnet_pool_presence_prevents_rbac_policy_deletion [1.816350s] ... ok
{1} neutron_tempest_plugin.api.test_routers.RoutersTest.test_create_update_router_description [1.543181s] ... ok
{1} neutron_tempest_plugin.api.test_routers.RoutersTest.test_extra_routes_atomic ... SKIPPED: Skipped because network extension: extraroute-atomic is not enabled
{2} neutron_tempest_plugin.api.test_revisions.TestRevisions.test_update_subnet_service_types_bumps_revisions [4.217279s] ... ok
{2} neutron_tempest_plugin.api.test_revisions.TestRevisions.test_update_subnetpool_bumps_revision [0.750047s] ... ok
{3} neutron_tempest_plugin.api.test_qos.QosMinimumPpsRuleTestJSON.test_get_rules_by_policy [2.311268s] ... ok
{3} neutron_tempest_plugin.api.test_qos.QosMinimumPpsRuleTestJSON.test_rule_create [0.944467s] ... ok
{3} neutron_tempest_plugin.api.test_qos.QosMinimumPpsRuleTestJSON.test_rule_create_any_direction_when_egress_direction_exists [0.717572s] ... ok
{3} neutron_tempest_plugin.api.test_qos.QosMinimumPpsRuleTestJSON.test_rule_create_egress_direction_when_any_direction_exists [1.034693s] ... ok
{0} neutron_tempest_plugin.api.test_availability_zones.ListAvailableZonesTest.test_list_available_zones [0.497800s] ... ok
{3} neutron_tempest_plugin.api.test_qos.QosMinimumPpsRuleTestJSON.test_rule_create_fail_for_missing_min_kpps [0.280078s] ... ok
{3} neutron_tempest_plugin.api.test_qos.QosMinimumPpsRuleTestJSON.test_rule_create_fail_for_the_same_type [0.976303s] ... ok
{3} neutron_tempest_plugin.api.test_qos.QosMinimumPpsRuleTestJSON.test_rule_create_forbidden_for_regular_tenants [0.713278s] ... ok
{3} neutron_tempest_plugin.api.test_qos.QosMinimumPpsRuleTestJSON.test_rule_delete [1.514824s] ... ok
{3} neutron_tempest_plugin.api.test_qos.QosMinimumPpsRuleTestJSON.test_rule_update [1.268139s] ... ok
{1} neutron_tempest_plugin.api.test_routers.RoutersTest.test_network_attached_with_two_routers [14.520443s] ... ok
{2} setUpClass (neutron_tempest_plugin.api.test_routers.DvrRoutersTest) ... SKIPPED: dvr extension not enabled.
{2} setUpClass (neutron_tempest_plugin.api.test_routers_negative.HaRoutersNegativeTest) ... SKIPPED: l3-ha extension not enabled.
{3} neutron_tempest_plugin.api.test_qos.QosMinimumPpsRuleTestJSON.test_rule_update_direction_conflict [1.536655s] ... ok
{0} neutron_tempest_plugin.api.test_extension_driver_port_security.PortSecTest.test_allowed_address_pairs [4.477666s] ... ok
{0} neutron_tempest_plugin.api.test_extension_driver_port_security.PortSecTest.test_create_port_sec_with_security_group [6.647612s] ... ok
{1} neutron_tempest_plugin.api.test_routers.RoutersTest.test_router_interface_status [10.136300s] ... ok
{3} neutron_tempest_plugin.api.test_qos_negative.QosDscpRuleNegativeTestJSON.test_rule_create_rule_non_existent_policy [1.169822s] ... ok
{3} neutron_tempest_plugin.api.test_qos_negative.QosDscpRuleNegativeTestJSON.test_rule_update_rule_nonexistent_policy [0.457727s] ... ok
{3} neutron_tempest_plugin.api.test_qos_negative.QosDscpRuleNegativeTestJSON.test_rule_update_rule_nonexistent_rule [0.322501s] ... ok
{0} neutron_tempest_plugin.api.test_extension_driver_port_security.PortSecTest.test_delete_with_port_sec [4.922496s] ... ok
{2} neutron_tempest_plugin.api.test_routers_negative.RoutersNegativePolicyTest.test_add_interface_in_use [4.835873s] ... ok
{2} neutron_tempest_plugin.api.test_routers_negative.RoutersNegativePolicyTest.test_add_interface_port_nonexist [0.245110s] ... ok
{0} neutron_tempest_plugin.api.test_extension_driver_port_security.PortSecTest.test_port_sec_default_value [1.866282s] ... ok
{0} neutron_tempest_plugin.api.test_extension_driver_port_security.PortSecTest.test_port_sec_specific_value_1 [4.418949s] ... ok
{2} neutron_tempest_plugin.api.test_routers_negative.RoutersNegativePolicyTest.test_add_interface_wrong_tenant [6.480649s] ... ok
{0} neutron_tempest_plugin.api.test_extension_driver_port_security.PortSecTest.test_port_sec_specific_value_2 [2.663435s] ... ok
{1} neutron_tempest_plugin.api.test_routers.RoutersTest.test_router_interface_update_and_remove_gateway_ip [15.605425s] ... ok
{0} neutron_tempest_plugin.api.test_extension_driver_port_security.PortSecTest.test_port_sec_update_pass [4.418553s] ... ok
{2} neutron_tempest_plugin.api.test_routers_negative.RoutersNegativePolicyTest.test_remove_associated_ports [12.121457s] ... ok
{0} neutron_tempest_plugin.api.test_extension_driver_port_security.PortSecTest.test_port_sec_update_port_failed [6.517988s] ... ok
{1} neutron_tempest_plugin.api.test_routers.RoutersTest.test_update_extra_route [11.304971s] ... ok
{3} neutron_tempest_plugin.api.test_routers.RoutersDeleteTest.test_delete_router [17.238146s] ... ok
{1} neutron_tempest_plugin.api.test_routers.RoutersTest.test_update_router_reset_gateway_without_snat [10.983228s] ... ok
{1} neutron_tempest_plugin.api.test_routers.RoutersTest.test_update_router_set_gateway_with_snat_explicit [17.875504s] ... ok
{0} neutron_tempest_plugin.api.test_extensions.ExtensionsTest.test_list_extensions_includes_all [0.528774s] ... ok
{0} neutron_tempest_plugin.api.test_extensions.ExtensionsTest.test_list_extensions_pagination [0.156379s] ... ok
{0} neutron_tempest_plugin.api.test_extensions.ExtensionsTest.test_list_extensions_project_id [0.084432s] ... ok
{0} neutron_tempest_plugin.api.test_extensions.ExtensionsTest.test_list_extensions_sorting [0.069959s] ... ok
{2} neutron_tempest_plugin.api.test_security_groups.RbacSharedSecurityGroupTest.test_filter_fields [4.639863s] ... ok
{1} neutron_tempest_plugin.api.test_routers.RoutersTest.test_update_router_set_gateway_without_snat [10.300219s] ... ok
{2} neutron_tempest_plugin.api.test_security_groups.RbacSharedSecurityGroupTest.test_filter_rbac_policies [1.106386s] ... ok
{2} neutron_tempest_plugin.api.test_security_groups.RbacSharedSecurityGroupTest.test_policy_target_update [0.881881s] ... ok
{0} neutron_tempest_plugin.api.test_flavors_extensions.TestFlavorsJson.test_create_update_delete_flavor [0.585025s] ... ok
{0} neutron_tempest_plugin.api.test_flavors_extensions.TestFlavorsJson.test_create_update_delete_service_profile [0.421364s] ... ok
{0} neutron_tempest_plugin.api.test_flavors_extensions.TestFlavorsJson.test_list_flavors [0.098676s] ... ok
{0} neutron_tempest_plugin.api.test_flavors_extensions.TestFlavorsJson.test_list_service_profiles [0.073018s] ... ok
{0} neutron_tempest_plugin.api.test_flavors_extensions.TestFlavorsJson.test_show_flavor [0.133334s] ... ok
{0} neutron_tempest_plugin.api.test_flavors_extensions.TestFlavorsJson.test_show_service_profile [0.125014s] ... ok
{2} neutron_tempest_plugin.api.test_security_groups.RbacSharedSecurityGroupTest.test_port_presence_prevents_policy_rbac_policy_deletion [2.809717s] ... ok
{3} neutron_tempest_plugin.api.test_security_groups.SecGroupRulesQuotaTest.test_create_sg_rules_when_quota_disabled [13.863572s] ... ok
{2} neutron_tempest_plugin.api.test_security_groups.RbacSharedSecurityGroupTest.test_rbac_policy_show [1.102096s] ... ok
{2} neutron_tempest_plugin.api.test_security_groups.RbacSharedSecurityGroupTest.test_regular_client_blocked_from_sharing_anothers_policy [1.475566s] ... ok
{1} setUpClass (neutron_tempest_plugin.api.test_routers_negative.DvrRoutersNegativeTest) ... SKIPPED: dvr extension not enabled.
{1} setUpClass (neutron_tempest_plugin.api.test_routers_negative.DvrRoutersNegativeTestExtended) ... SKIPPED: dvr extension not enabled.
{2} neutron_tempest_plugin.api.test_security_groups.RbacSharedSecurityGroupTest.test_regular_client_shares_to_another_regular_client [2.555682s] ... ok
{3} neutron_tempest_plugin.api.test_security_groups.SecGroupRulesQuotaTest.test_sg_rules_quota_decrease_less_than_created [7.546671s] ... ok
{1} neutron_tempest_plugin.api.test_security_groups_negative.NegativeSecGroupRulesQuotaTest.test_sg_creation_with_insufficient_sg_rules_quota [1.206775s] ... ok
{1} neutron_tempest_plugin.api.test_service_type_management.ServiceTypeManagementTest.test_service_provider_list [0.457842s] ... ok
{1} setUpClass (neutron_tempest_plugin.api.test_subnetpools.RbacSubnetPoolTest) ... SKIPPED: rbac-subnetpool extension not enabled.
{2} neutron_tempest_plugin.api.test_security_groups.SecGroupNormalizedCidrTest.test_normalized_cidr_in_rule [1.846059s] ... ok
{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsSearchCriteriaTest.test_list_no_pagination_limit_0 [0.116002s] ... ok
{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsSearchCriteriaTest.test_list_pagination [0.586737s] ... ok
{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsSearchCriteriaTest.test_list_pagination_page_reverse_asc [0.355545s] ... ok
{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsSearchCriteriaTest.test_list_pagination_page_reverse_desc [0.216798s] ... ok
{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsSearchCriteriaTest.test_list_pagination_page_reverse_with_href_links [0.429008s] ... ok
{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsSearchCriteriaTest.test_list_pagination_with_href_links [1.618602s] ... ok
{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsSearchCriteriaTest.test_list_pagination_with_marker [0.656340s] ... ok
{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsSearchCriteriaTest.test_list_sorts_asc [0.099070s] ... ok
{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsSearchCriteriaTest.test_list_sorts_desc [0.120912s] ... ok
{0} neutron_tempest_plugin.api.test_floating_ips_negative.FloatingIPNegativeTestJSON.test_associate_floatingip_with_port_with_floatingip [12.073557s] ... ok
{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsSearchCriteriaTest.test_list_validation_filters [0.160927s] ... ok
{3} neutron_tempest_plugin.api.test_security_groups.SecGroupRulesQuotaTest.test_sg_rules_quota_increased [18.821029s] ... ok
{3} neutron_tempest_plugin.api.test_security_groups.SecGroupRulesQuotaTest.test_sg_rules_quota_values [1.488369s] ... ok
{2} neutron_tempest_plugin.api.test_security_groups.SecGroupQuotaTest.test_create_sg_when_quota_disabled [10.149166s] ... ok
{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTest.test_admin_create_shared_subnetpool [1.011119s] ... ok
{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTest.test_create_list_subnetpool [0.360832s] ... ok
{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTest.test_create_subnet_from_pool_with_default_prefixlen [2.250085s] ... ok
{3} neutron_tempest_plugin.api.test_security_groups.StatefulSecGroupProtocolIPv6Test.test_security_group_rule_protocol_legacy_icmpv6 [2.366803s] ... ok
{0} setUpClass (neutron_tempest_plugin.api.test_metering_extensions.MeteringTestJSON) ... SKIPPED: metering extension not enabled.
{0} setUpClass (neutron_tempest_plugin.api.test_ndp_proxy_negative.NDPProxyNegativeTestJSON) ... SKIPPED: l3-ndp-proxy extension not enabled.
{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTest.test_create_subnet_from_pool_with_prefixlen [2.890566s] ... ok
{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTest.test_create_subnet_from_pool_with_quota [2.385769s] ... ok
{0} neutron_tempest_plugin.api.test_network_ip_availability_negative.NetworksIpAvailabilityNegativeTest.test_network_availability_nonexistent_network_id [1.071365s] ... ok
{2} neutron_tempest_plugin.api.test_security_groups.SecGroupQuotaTest.test_max_allowed_sg_amount [9.223481s] ... ok
{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTest.test_create_subnet_from_pool_with_subnet_cidr [3.959933s] ... ok
{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTest.test_create_subnetpool_associate_address_scope [0.495488s] ... ok
{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTest.test_create_update_subnetpool_description [0.956725s] ... ok
{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTest.test_get_subnetpool [0.389932s] ... ok
{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTest.test_show_subnetpool_has_project_id [0.389678s] ... ok
{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTest.test_tenant_create_non_default_subnetpool [0.316511s] ... ok
{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTest.test_tenant_update_subnetpool [0.375629s] ... ok
{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTest.test_update_subnetpool_associate_address_scope [0.603394s] ... ok
{0} neutron_tempest_plugin.api.test_networks.NetworksMtuTestJSON.test_create_network_custom_mtu [1.564202s] ... ok
{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTest.test_update_subnetpool_associate_another_address_scope [0.859904s] ... ok
{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTest.test_update_subnetpool_disassociate_address_scope [0.638586s] ... ok
{0} neutron_tempest_plugin.api.test_networks.NetworksMtuTestJSON.test_update_network_custom_mtu [1.234435s] ... ok
{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTest.test_update_subnetpool_prefixes_append [0.345434s] ... ok
{2} neutron_tempest_plugin.api.test_security_groups.SecGroupQuotaTest.test_sg_quota_decrease_less_than_created [8.530250s] ... ok
{3} neutron_tempest_plugin.api.test_security_groups.StatefulSecGroupProtocolTest.test_security_group_rule_protocol_ints [9.900287s] ... ok
{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTest.test_update_subnetpool_prefixes_extend [0.421089s] ... ok
{0} neutron_tempest_plugin.api.test_networks.NetworksTestJSON.test_create_network_with_project [1.168753s] ... ok
{3} neutron_tempest_plugin.api.test_security_groups.StatefulSecGroupProtocolTest.test_security_group_rule_protocol_names [7.962857s] ... ok
{0} neutron_tempest_plugin.api.test_networks.NetworksTestJSON.test_create_update_network_dns_domain [1.492245s] ... ok
{0} neutron_tempest_plugin.api.test_networks.NetworksTestJSON.test_list_networks_fields_keystone_v3 [0.902678s] ... ok
{0} neutron_tempest_plugin.api.test_networks.NetworksTestJSON.test_show_network [0.245780s] ... ok
{0} neutron_tempest_plugin.api.test_networks.NetworksTestJSON.test_show_network_fields_keystone_v3 [0.895733s] ... ok
{3} neutron_tempest_plugin.api.test_security_groups.StatelessSecGroupProtocolIPv6Test.test_security_group_rule_protocol_legacy_icmpv6 [2.503980s] ... ok
{2} neutron_tempest_plugin.api.test_security_groups.SecGroupQuotaTest.test_sg_quota_increased [16.012126s] ... ok
{1} neutron_tempest_plugin.api.test_subnets.SubnetServiceTypeTestJSON.test_allocate_ips_are_from_correct_subnet [3.561559s] ... ok
{2} neutron_tempest_plugin.api.test_security_groups.SecGroupQuotaTest.test_sg_quota_values [2.483802s] ... ok
{0} neutron_tempest_plugin.api.test_ports.PortsIpv6TestJSON.test_add_ipv6_ips_to_port [1.939455s] ... ok
{0} neutron_tempest_plugin.api.test_ports.PortsTaggingOnCreationTestJSON.test_tagging_ports_during_bulk_creation ... SKIPPED: Skipped because network extension: tag-ports-during-bulk-creation is not enabled
{0} neutron_tempest_plugin.api.test_ports.PortsTaggingOnCreationTestJSON.test_tagging_ports_during_bulk_creation_no_tags ... SKIPPED: Skipped because network extension: tag-ports-during-bulk-creation is not enabled
{0} neutron_tempest_plugin.api.test_ports.PortsTaggingOnCreationTestJSON.test_tagging_ports_during_creation ... SKIPPED: Skipped because network extension: tag-ports-during-bulk-creation is not enabled
{3} neutron_tempest_plugin.api.test_security_groups.StatelessSecGroupProtocolTest.test_security_group_rule_protocol_ints [8.725653s] ... ok
{2} neutron_tempest_plugin.api.test_security_groups.StatelessSecGroupTest.test_create_bulk_sec_groups [1.520815s] ... ok
{2} neutron_tempest_plugin.api.test_security_groups.StatelessSecGroupTest.test_create_list_update_show_delete_security_group [0.914927s] ... ok
{1} neutron_tempest_plugin.api.test_trunk.TrunkTestJSON.test_add_subports [7.229157s] ... ok
{0} neutron_tempest_plugin.api.test_qos.QosDscpMarkingRuleTestJSON.test_get_rules_by_policy [2.041917s] ... ok
{0} neutron_tempest_plugin.api.test_qos.QosDscpMarkingRuleTestJSON.test_invalid_rule_create [0.189792s] ... ok
{2} neutron_tempest_plugin.api.test_security_groups.StatelessSecGroupTest.test_create_sec_groups_with_the_same_name [2.613985s] ... ok
{0} neutron_tempest_plugin.api.test_qos.QosDscpMarkingRuleTestJSON.test_rule_create [0.669689s] ... ok
{3} neutron_tempest_plugin.api.test_security_groups.StatelessSecGroupProtocolTest.test_security_group_rule_protocol_names [8.242699s] ... ok
{0} neutron_tempest_plugin.api.test_qos.QosDscpMarkingRuleTestJSON.test_rule_create_fail_for_the_same_type [0.421599s] ... ok
{0} neutron_tempest_plugin.api.test_qos.QosDscpMarkingRuleTestJSON.test_rule_create_forbidden_for_regular_tenants [0.066699s] ... ok
{0} neutron_tempest_plugin.api.test_qos.QosDscpMarkingRuleTestJSON.test_rule_create_rule_nonexistent_policy [0.093933s] ... ok
{0} neutron_tempest_plugin.api.test_qos.QosDscpMarkingRuleTestJSON.test_rule_delete [1.564835s] ... ok
{1} neutron_tempest_plugin.api.test_trunk.TrunkTestJSON.test_create_show_delete_trunk [3.676733s] ... ok
{2} neutron_tempest_plugin.api.test_security_groups.StatelessSecGroupTest.test_list_security_group_rules_contains_all_rules [2.519428s] ... ok
{0} neutron_tempest_plugin.api.test_qos.QosDscpMarkingRuleTestJSON.test_rule_update [0.809947s] ... ok
{2} neutron_tempest_plugin.api.test_security_groups.StatelessSecGroupTest.test_show_security_group_contains_all_rules [0.957291s] ... ok
{2} neutron_tempest_plugin.api.test_security_groups.StatelessSecGroupTest.test_stateless_security_group_update [2.001341s] ... ok
{1} neutron_tempest_plugin.api.test_trunk.TrunkTestJSON.test_create_trunk_empty_subports_list [3.446350s] ... ok
{3} neutron_tempest_plugin.api.test_security_groups_negative.NegativeSecGroupQuotaTest.test_create_excess_sg [1.883090s] ... ok
{3} neutron_tempest_plugin.api.test_security_groups_negative.NegativeSecGroupQuotaTest.test_sg_quota_incorrect_values [1.866960s] ... ok
{1} neutron_tempest_plugin.api.test_trunk.TrunkTestJSON.test_create_trunk_subports_not_specified [5.127519s] ... ok
{0} neutron_tempest_plugin.api.test_qos_negative.QosMinimumBandwidthRuleNegativeTestJSON.test_rule_create_rule_non_existent_policy [0.636088s] ... ok
{0} neutron_tempest_plugin.api.test_qos_negative.QosMinimumBandwidthRuleNegativeTestJSON.test_rule_update_rule_nonexistent_policy [1.448824s] ... ok
{0} neutron_tempest_plugin.api.test_qos_negative.QosMinimumBandwidthRuleNegativeTestJSON.test_rule_update_rule_nonexistent_rule [0.353136s] ... ok
{1} neutron_tempest_plugin.api.test_trunk.TrunkTestJSON.test_create_update_trunk [6.095812s] ... ok
{3} neutron_tempest_plugin.api.test_subnetpool_prefix_ops.SubnetPoolPrefixOpsIpv4Test.test_add_overlapping_prefix [0.849248s] ... ok
{3} neutron_tempest_plugin.api.test_subnetpool_prefix_ops.SubnetPoolPrefixOpsIpv4Test.test_add_remove_prefix [0.653313s] ... ok
{2} neutron_tempest_plugin.api.test_security_groups_negative.NegativeSecGroupIPv6Test.test_assign_nonexistent_sec_group [1.801077s] ... ok
{1} neutron_tempest_plugin.api.test_trunk.TrunkTestJSON.test_create_update_trunk_with_description [4.359475s] ... ok
{2} neutron_tempest_plugin.api.test_security_groups_negative.NegativeSecGroupIPv6Test.test_assign_sec_group_twice [2.561247s] ... ok
{2} neutron_tempest_plugin.api.test_security_groups_negative.NegativeSecGroupIPv6Test.test_create_security_group_with_boolean_type_name [0.176918s] ... ok
{2} neutron_tempest_plugin.api.test_security_groups_negative.NegativeSecGroupIPv6Test.test_create_security_group_with_too_long_name [0.049159s] ... ok
{2} neutron_tempest_plugin.api.test_security_groups_negative.NegativeSecGroupIPv6Test.test_delete_in_use_sec_group [1.139535s] ... ok
{0} setUpClass (neutron_tempest_plugin.api.test_router_interface_fip.RouterInterfaceFip) ... SKIPPED: Skipped because network extension: router-interface-fip is not enabled
{0} setUpClass (neutron_tempest_plugin.api.test_routers.DvrRoutersTestUpdateDistributedExtended) ... SKIPPED: dvr extension not enabled.
{2} neutron_tempest_plugin.api.test_security_groups_negative.NegativeSecGroupIPv6Test.test_no_sec_group_changes_after_assignment_failure [1.913982s] ...
ok\n{2} neutron_tempest_plugin.api.test_security_groups_negative.NegativeSecGroupIPv6Test.test_update_default_security_group_name [0.528390s] ... ok\n{2} neutron_tempest_plugin.api.test_security_groups_negative.NegativeSecGroupIPv6Test.test_update_security_group_with_boolean_type_name [0.427186s] ... ok\n{2} neutron_tempest_plugin.api.test_security_groups_negative.NegativeSecGroupIPv6Test.test_update_security_group_with_too_long_name [0.346642s] ... ok\n{1} neutron_tempest_plugin.api.test_trunk.TrunkTestJSON.test_delete_trunk_with_subport_is_allowed [6.266959s] ... ok\n{3} neutron_tempest_plugin.api.test_timestamp.TestTimeStamp.test_create_network_with_timestamp [2.070363s] ... ok\n{0} neutron_tempest_plugin.api.test_routers.RoutersSearchCriteriaTest.test_list_no_pagination_limit_0 [0.187413s] ... ok\n{0} neutron_tempest_plugin.api.test_routers.RoutersSearchCriteriaTest.test_list_pagination [0.904493s] ... ok\n{3} neutron_tempest_plugin.api.test_timestamp.TestTimeStamp.test_create_port_with_timestamp [1.871069s] ... ok\n{0} neutron_tempest_plugin.api.test_routers.RoutersSearchCriteriaTest.test_list_pagination_page_reverse_asc [0.245023s] ... ok\n{0} neutron_tempest_plugin.api.test_routers.RoutersSearchCriteriaTest.test_list_pagination_page_reverse_desc [0.331584s] ... ok\n{0} neutron_tempest_plugin.api.test_routers.RoutersSearchCriteriaTest.test_list_pagination_page_reverse_with_href_links [0.617031s] ... ok\n{3} neutron_tempest_plugin.api.test_timestamp.TestTimeStamp.test_create_subnet_with_timestamp [2.802274s] ... ok\n{3} neutron_tempest_plugin.api.test_timestamp.TestTimeStamp.test_create_subnetpool_with_timestamp [1.111835s] ... ok\n{0} neutron_tempest_plugin.api.test_routers.RoutersSearchCriteriaTest.test_list_pagination_with_href_links [4.778374s] ... ok\n{3} neutron_tempest_plugin.api.test_timestamp.TestTimeStamp.test_segment_with_timestamp [2.577400s] ... ok\n{1} neutron_tempest_plugin.api.test_trunk.TrunkTestJSON.test_get_subports [9.679438s] ... 
ok\n{0} neutron_tempest_plugin.api.test_routers.RoutersSearchCriteriaTest.test_list_pagination_with_marker [1.170398s] ... ok\n{0} neutron_tempest_plugin.api.test_routers.RoutersSearchCriteriaTest.test_list_sorts_asc [0.135874s] ... ok\n{0} neutron_tempest_plugin.api.test_routers.RoutersSearchCriteriaTest.test_list_sorts_desc [0.147998s] ... ok\n{3} neutron_tempest_plugin.api.test_timestamp.TestTimeStamp.test_show_networks_attribute_with_timestamp [0.942313s] ... ok\n{3} neutron_tempest_plugin.api.test_timestamp.TestTimeStamp.test_show_port_attribute_with_timestamp [1.361725s] ... ok\n{2} neutron_tempest_plugin.api.test_security_groups_negative.NegativeSecGroupTest.test_assign_nonexistent_sec_group [3.788244s] ... ok\n{3} neutron_tempest_plugin.api.test_timestamp.TestTimeStamp.test_show_subnet_attribute_with_timestamp [4.909637s] ... ok\n{3} neutron_tempest_plugin.api.test_timestamp.TestTimeStamp.test_show_subnetpool_attribute_with_timestamp [0.374425s] ... ok\n{2} neutron_tempest_plugin.api.test_security_groups_negative.NegativeSecGroupTest.test_assign_sec_group_twice [2.266525s] ... ok\n{2} neutron_tempest_plugin.api.test_security_groups_negative.NegativeSecGroupTest.test_create_security_group_with_boolean_type_name [0.113535s] ... ok\n{2} neutron_tempest_plugin.api.test_security_groups_negative.NegativeSecGroupTest.test_create_security_group_with_too_long_name [0.065795s] ... ok\n{3} neutron_tempest_plugin.api.test_timestamp.TestTimeStamp.test_update_network_with_timestamp [1.739238s] ... ok\n{1} neutron_tempest_plugin.api.test_trunk.TrunkTestJSON.test_list_trunks [9.502239s] ... ok\n{2} neutron_tempest_plugin.api.test_security_groups_negative.NegativeSecGroupTest.test_delete_in_use_sec_group [1.437100s] ... ok\n{3} neutron_tempest_plugin.api.test_timestamp.TestTimeStamp.test_update_port_with_timestamp [1.722476s] ... 
ok\n{2} neutron_tempest_plugin.api.test_security_groups_negative.NegativeSecGroupTest.test_no_sec_group_changes_after_assignment_failure [1.818715s] ... ok\n{2} neutron_tempest_plugin.api.test_security_groups_negative.NegativeSecGroupTest.test_update_default_security_group_name [0.603958s] ... ok\n{2} neutron_tempest_plugin.api.test_security_groups_negative.NegativeSecGroupTest.test_update_security_group_with_boolean_type_name [0.384920s] ... ok\n{2} neutron_tempest_plugin.api.test_security_groups_negative.NegativeSecGroupTest.test_update_security_group_with_too_long_name [0.293602s] ... ok\n{3} neutron_tempest_plugin.api.test_timestamp.TestTimeStamp.test_update_subnet_with_timestamp [2.775865s] ... ok\n{3} neutron_tempest_plugin.api.test_timestamp.TestTimeStamp.test_update_subnetpool_with_timestamp [0.377209s] ... ok\n{0} neutron_tempest_plugin.api.test_security_groups.SecGroupSearchCriteriaTest.test_list_no_pagination_limit_0 [0.174312s] ... ok\n{0} neutron_tempest_plugin.api.test_security_groups.SecGroupSearchCriteriaTest.test_list_pagination [0.904920s] ... ok\n{0} neutron_tempest_plugin.api.test_security_groups.SecGroupSearchCriteriaTest.test_list_sorts_by_name_asc [0.190485s] ... ok\n{0} neutron_tempest_plugin.api.test_security_groups.SecGroupSearchCriteriaTest.test_list_sorts_by_name_desc [0.232952s] ... ok\n{1} neutron_tempest_plugin.api.test_trunk.TrunkTestJSON.test_remove_subport [10.731028s] ... ok\n{1} neutron_tempest_plugin.api.test_trunk.TrunkTestJSON.test_show_trunk_has_project_id [4.056402s] ... ok\n{2} neutron_tempest_plugin.api.test_security_groups_negative.NegativeStatelessSecGroupTest.test_adding_stateful_sg_to_port_with_stateless_sg [1.080163s] ... ok\n{0} neutron_tempest_plugin.api.test_security_groups.StatefulSecGroupTest.test_create_bulk_sec_groups [1.959328s] ... ok\n{1} setUpClass (neutron_tempest_plugin.scenario.test_dns_integration.DNSIntegrationTests) ... 
SKIPPED: DNSIntegrationTests skipped as designate is not available\n{2} neutron_tempest_plugin.api.test_security_groups_negative.NegativeStatelessSecGroupTest.test_adding_stateless_sg_to_port_with_stateful_sg [1.047018s] ... ok\n{0} neutron_tempest_plugin.api.test_security_groups.StatefulSecGroupTest.test_create_list_update_show_delete_security_group [1.127145s] ... ok\n{2} neutron_tempest_plugin.api.test_security_groups_negative.NegativeStatelessSecGroupTest.test_create_port_with_stateful_and_stateless_sg [0.289251s] ... ok\n{2} neutron_tempest_plugin.api.test_security_groups_negative.NegativeStatelessSecGroupTest.test_update_used_stateful_sg_to_stateless [0.884506s] ... ok\n{2} neutron_tempest_plugin.api.test_security_groups_negative.NegativeStatelessSecGroupTest.test_update_used_stateless_sg_to_stateful [0.867683s] ... ok\n{0} neutron_tempest_plugin.api.test_security_groups.StatefulSecGroupTest.test_create_sec_groups_with_the_same_name [2.016971s] ... ok\n{3} neutron_tempest_plugin.api.test_timestamp.TestTimeStampWithSecurityGroup.test_create_sg_with_timestamp [0.936924s] ... ok\n{3} neutron_tempest_plugin.api.test_timestamp.TestTimeStampWithSecurityGroup.test_create_sgrule_with_timestamp [0.797873s] ... ok\n{3} neutron_tempest_plugin.api.test_timestamp.TestTimeStampWithSecurityGroup.test_show_sg_attribute_with_timestamp [0.351629s] ... ok\n{0} neutron_tempest_plugin.api.test_security_groups.StatefulSecGroupTest.test_list_security_group_rules_contains_all_rules [2.276316s] ... ok\n{3} neutron_tempest_plugin.api.test_timestamp.TestTimeStampWithSecurityGroup.test_show_sgrule_attribute_with_timestamp [0.743161s] ... ok\n{0} neutron_tempest_plugin.api.test_security_groups.StatefulSecGroupTest.test_show_security_group_contains_all_rules [1.067496s] ... ok\n{3} neutron_tempest_plugin.api.test_timestamp.TestTimeStampWithSecurityGroup.test_update_sg_with_timestamp [2.706891s] ... 
ok\n{2} neutron_tempest_plugin.api.test_subnetpool_prefix_ops.SubnetPoolPrefixOpsIpv6Test.test_add_overlapping_prefix [0.771568s] ... ok\n{3} setUpClass (neutron_tempest_plugin.api.test_trunk.TrunkTestMtusJSON) ... SKIPPED: Either vxlan or vlan type driver not enabled.\n{2} neutron_tempest_plugin.api.test_subnetpool_prefix_ops.SubnetPoolPrefixOpsIpv6Test.test_add_remove_prefix [1.381416s] ... ok\n{3} setUpClass (neutron_tempest_plugin.api.test_trunk_negative.TrunkTestJSON) [0.000000s] ... FAILED\n\nCaptured traceback:\n~~~~~~~~~~~~~~~~~~~\n Traceback (most recent call last):\n\n File \"/usr/lib/python3.9/site-packages/tempest/test.py\", line 185, in setUpClass\n raise value.with_traceback(trace)\n\n File \"/usr/lib/python3.9/site-packages/tempest/test.py\", line 170, in setUpClass\n cls.setup_credentials()\n\n File \"/usr/lib/python3.9/site-packages/neutron_tempest_plugin/api/base.py\", line 117, in setup_credentials\n super(BaseNetworkTest, cls).setup_credentials()\n\n \ File \"/usr/lib/python3.9/site-packages/tempest/test.py\", line 398, in setup_credentials\n \ manager = cls.get_client_manager(\n\n File \"/usr/lib/python3.9/site-packages/neutron_tempest_plugin/api/base.py\", line 89, in get_client_manager\n manager = super(BaseNetworkTest, cls).get_client_manager(\n\n \ File \"/usr/lib/python3.9/site-packages/tempest/test.py\", line 743, in get_client_manager\n \ creds = getattr(cred_provider, credentials_method)()\n\n File \"/usr/lib/python3.9/site-packages/tempest/lib/common/dynamic_creds.py\", line 473, in get_primary_creds\n return self.get_project_member_creds()\n\n File \"/usr/lib/python3.9/site-packages/tempest/lib/common/dynamic_creds.py\", line 514, in get_project_member_creds\n return self.get_credentials(['member'], scope='project')\n\n \ File \"/usr/lib/python3.9/site-packages/tempest/lib/common/dynamic_creds.py\", line 436, in get_credentials\n credentials = self._create_creds(\n\n File 
\"/usr/lib/python3.9/site-packages/tempest/lib/common/dynamic_creds.py\", line 200, in _create_creds\n project = self.creds_client.create_project(\n\n File \"/usr/lib/python3.9/site-packages/tempest/lib/common/cred_client.py\", line 164, in create_project\n project = self.projects_client.create_project(\n\n File \"/usr/lib/python3.9/site-packages/tempest/lib/services/identity/v3/projects_client.py\", line 37, in create_project\n resp, body = self.post('projects', post_body)\n\n \ File \"/usr/lib/python3.9/site-packages/tempest/lib/common/rest_client.py\", line 314, in post\n resp_header, resp_body = self.request(\n\n File \"/usr/lib/python3.9/site-packages/tempest/lib/common/rest_client.py\", line 762, in request\n self._error_checker(resp, resp_body)\n\n File \"/usr/lib/python3.9/site-packages/tempest/lib/common/rest_client.py\", line 946, in _error_checker\n raise exceptions.UnexpectedResponseCode(str(resp.status),\n\n \ tempest.lib.exceptions.UnexpectedResponseCode: Unexpected response code received\nDetails: 504\n\n{3} setUpClass (neutron_tempest_plugin.api.test_trunk_negative.TrunkTestMtusJSON) ... SKIPPED: Either vxlan or vlan type driver not enabled.\n{3} setUpClass (neutron_tempest_plugin.scenario.test_dvr.NetworkDvrTest) ... SKIPPED: Skipped because network extension: dvr is not enabled\n{0} neutron_tempest_plugin.api.test_security_groups_negative.NegativeSecGroupProtocolTest.test_create_security_group_rule_with_ipv6_protocol_integers [2.543355s] ... ok\n{3} setUpClass (neutron_tempest_plugin.scenario.test_fip64.Fip64) ... SKIPPED: Skipped because network extension: fip64 is not enabled\n{0} neutron_tempest_plugin.api.test_security_groups_negative.NegativeSecGroupProtocolTest.test_create_security_group_rule_with_ipv6_protocol_names [1.470036s] ... ok\n{2} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTestV6.test_admin_create_shared_subnetpool [1.618370s] ... 
ok\n{2} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTestV6.test_create_dual_stack_subnets_from_subnetpools [4.784466s] ... ok\n{2} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTestV6.test_create_list_subnetpool [0.485529s] ... ok\n{0} neutron_tempest_plugin.api.test_subnetpools_negative.SubnetPoolsNegativeTestJSON.test_create_sp_associate_address_scope_multiple_prefix_intersect [1.151299s] ... ok\n{2} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTestV6.test_create_subnet_from_pool_with_default_prefixlen [4.934364s] ... ok\n{0} neutron_tempest_plugin.api.test_subnetpools_negative.SubnetPoolsNegativeTestJSON.test_create_subnet_different_pools_same_network [5.321652s] ... ok\n{0} neutron_tempest_plugin.api.test_subnetpools_negative.SubnetPoolsNegativeTestJSON.test_create_subnetpool_associate_address_scope_of_other_owner [0.244969s] ... ok\n{0} neutron_tempest_plugin.api.test_subnetpools_negative.SubnetPoolsNegativeTestJSON.test_create_subnetpool_associate_address_scope_prefix_intersect [0.960678s] ... ok\n{0} neutron_tempest_plugin.api.test_subnetpools_negative.SubnetPoolsNegativeTestJSON.test_create_subnetpool_associate_invalid_address_scope [0.047961s] ... ok\n{0} neutron_tempest_plugin.api.test_subnetpools_negative.SubnetPoolsNegativeTestJSON.test_create_subnetpool_associate_non_exist_address_scope [0.235721s] ... ok\n{0} neutron_tempest_plugin.api.test_subnetpools_negative.SubnetPoolsNegativeTestJSON.test_delete_non_existent_subnetpool [0.061448s] ... ok\n{2} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTestV6.test_create_subnet_from_pool_with_prefixlen [2.229584s] ... ok\n{0} neutron_tempest_plugin.api.test_subnetpools_negative.SubnetPoolsNegativeTestJSON.test_get_non_existent_subnetpool [0.078766s] ... ok\n{0} neutron_tempest_plugin.api.test_subnetpools_negative.SubnetPoolsNegativeTestJSON.test_tenant_create_default_subnetpool [0.080447s] ... 
ok\n{0} neutron_tempest_plugin.api.test_subnetpools_negative.SubnetPoolsNegativeTestJSON.test_tenant_create_shared_subnetpool [0.082199s] ... ok\n{0} neutron_tempest_plugin.api.test_subnetpools_negative.SubnetPoolsNegativeTestJSON.test_tenant_create_subnetpool_associate_shared_address_scope ... SKIPPED: Test is outdated starting from Ussuri release.\n{0} neutron_tempest_plugin.api.test_subnetpools_negative.SubnetPoolsNegativeTestJSON.test_tenant_get_not_shared_admin_subnetpool [0.442570s] ... ok\n{2} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTestV6.test_create_subnet_from_pool_with_quota [2.782988s] ... ok\n{0} neutron_tempest_plugin.api.test_subnetpools_negative.SubnetPoolsNegativeTestJSON.test_tenant_update_sp_prefix_associated_with_shared_addr_scope [2.185374s] ... ok\n{0} neutron_tempest_plugin.api.test_subnetpools_negative.SubnetPoolsNegativeTestJSON.test_update_non_existent_subnetpool [0.090400s] ... ok\n{0} neutron_tempest_plugin.api.test_subnetpools_negative.SubnetPoolsNegativeTestJSON.test_update_subnetpool_associate_address_scope_of_other_owner [0.372359s] ... ok\n{0} neutron_tempest_plugin.api.test_subnetpools_negative.SubnetPoolsNegativeTestJSON.test_update_subnetpool_associate_address_scope_wrong_ip_version [0.414117s] ... ok\n{2} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTestV6.test_create_subnet_from_pool_with_subnet_cidr [1.997458s] ... ok\n{0} neutron_tempest_plugin.api.test_subnetpools_negative.SubnetPoolsNegativeTestJSON.test_update_subnetpool_multiple_prefix_intersect [1.233021s] ... ok\n{0} neutron_tempest_plugin.api.test_subnetpools_negative.SubnetPoolsNegativeTestJSON.test_update_subnetpool_not_modifiable_shared [0.170542s] ... ok\n{2} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTestV6.test_create_subnetpool_associate_address_scope [0.364110s] ... ok\n{2} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTestV6.test_create_update_subnetpool_description [0.487714s] ... 
ok\n{2} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTestV6.test_get_subnetpool [0.281642s] ... ok\n{2} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTestV6.test_show_subnetpool_has_project_id [0.202570s] ... ok\n{0} neutron_tempest_plugin.api.test_subnetpools_negative.SubnetPoolsNegativeTestJSON.test_update_subnetpool_prefix_intersect [1.010235s] ... ok\n{2} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTestV6.test_tenant_create_non_default_subnetpool [0.096367s] ... ok\n{0} neutron_tempest_plugin.api.test_subnetpools_negative.SubnetPoolsNegativeTestJSON.test_update_subnetpool_prefixes_shrink [0.317787s] ... ok\n{2} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTestV6.test_tenant_update_subnetpool [0.424902s] ... ok\n{0} neutron_tempest_plugin.api.test_subnetpools_negative.SubnetPoolsNegativeTestJSON.test_update_subnetpool_tenant_id [0.199502s] ... ok\n{2} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTestV6.test_update_subnetpool_associate_address_scope [1.060711s] ... ok\n{2} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTestV6.test_update_subnetpool_associate_another_address_scope [0.922087s] ... ok\n{2} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTestV6.test_update_subnetpool_disassociate_address_scope [1.168384s] ... ok\n{2} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTestV6.test_update_subnetpool_prefixes_append [0.365154s] ... ok\n{2} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTestV6.test_update_subnetpool_prefixes_extend [0.317641s] ... ok\n{0} neutron_tempest_plugin.api.test_subnets.SubnetsSearchCriteriaTest.test_list_no_pagination_limit_0 [0.280983s] ... ok\n{0} neutron_tempest_plugin.api.test_subnets.SubnetsSearchCriteriaTest.test_list_pagination [0.859780s] ... ok\n{0} neutron_tempest_plugin.api.test_subnets.SubnetsSearchCriteriaTest.test_list_pagination_page_reverse_asc [0.382789s] ... 
ok\n{0} neutron_tempest_plugin.api.test_subnets.SubnetsSearchCriteriaTest.test_list_pagination_page_reverse_desc [0.261022s] ... ok\n{0} neutron_tempest_plugin.api.test_subnets.SubnetsSearchCriteriaTest.test_list_pagination_page_reverse_with_href_links [1.561104s] ... ok\n{2} neutron_tempest_plugin.api.test_timestamp.TestTimeStampWithL3.test_create_floatingip_with_timestamp [1.839366s] ... ok\n{2} neutron_tempest_plugin.api.test_timestamp.TestTimeStampWithL3.test_create_router_with_timestamp [0.228321s] ... ok\n{2} neutron_tempest_plugin.api.test_timestamp.TestTimeStampWithL3.test_show_floatingip_attribute_with_timestamp [0.921789s] ... ok\n{2} neutron_tempest_plugin.api.test_timestamp.TestTimeStampWithL3.test_show_router_attribute_with_timestamp [0.409341s] ... ok\n{0} neutron_tempest_plugin.api.test_subnets.SubnetsSearchCriteriaTest.test_list_pagination_with_href_links [1.992014s] ... ok\n{2} neutron_tempest_plugin.api.test_timestamp.TestTimeStampWithL3.test_update_floatingip_with_timestamp [0.838071s] ... ok\n{0} neutron_tempest_plugin.api.test_subnets.SubnetsSearchCriteriaTest.test_list_pagination_with_marker [1.194689s] ... ok\n{0} neutron_tempest_plugin.api.test_subnets.SubnetsSearchCriteriaTest.test_list_sorts_asc [0.120766s] ... ok\n{0} neutron_tempest_plugin.api.test_subnets.SubnetsSearchCriteriaTest.test_list_sorts_desc [0.131685s] ... ok\n{0} neutron_tempest_plugin.api.test_subnets.SubnetsSearchCriteriaTest.test_list_validation_filters [0.149506s] ... ok\n{2} neutron_tempest_plugin.api.test_timestamp.TestTimeStampWithL3.test_update_router_with_timestamp [2.867470s] ... ok\n{0} setUpClass (neutron_tempest_plugin.api.test_trunk.TrunkTestInheritJSONBase) ... SKIPPED: VLAN type_driver is not enabled\n{2} neutron_tempest_plugin.api.test_trunk.TrunksSearchCriteriaTest.test_list_no_pagination_limit_0 [0.081539s] ... ok\n{2} neutron_tempest_plugin.api.test_trunk.TrunksSearchCriteriaTest.test_list_pagination [0.410396s] ... 
ok\n{2} neutron_tempest_plugin.api.test_trunk.TrunksSearchCriteriaTest.test_list_pagination_page_reverse_asc [0.630073s] ... ok\n{2} neutron_tempest_plugin.api.test_trunk.TrunksSearchCriteriaTest.test_list_pagination_page_reverse_desc [0.155979s] ... ok\n{2} neutron_tempest_plugin.api.test_trunk.TrunksSearchCriteriaTest.test_list_pagination_page_reverse_with_href_links [0.620196s] ... ok\n{2} neutron_tempest_plugin.api.test_trunk.TrunksSearchCriteriaTest.test_list_pagination_with_href_links [1.608244s] ... ok\n{2} neutron_tempest_plugin.api.test_trunk.TrunksSearchCriteriaTest.test_list_pagination_with_marker [1.111932s] ... ok\n{2} neutron_tempest_plugin.api.test_trunk.TrunksSearchCriteriaTest.test_list_sorts_asc [0.095744s] ... ok\n{2} neutron_tempest_plugin.api.test_trunk.TrunksSearchCriteriaTest.test_list_sorts_desc [0.090578s] ... ok\n{1} neutron_tempest_plugin.scenario.test_floatingip.DefaultSnatToExternal.test_snat_external_ip [83.958680s] ... ok\n{2} neutron_tempest_plugin.api.test_trunk_details.TestTrunkDetailsJSON.test_port_resource_empty_trunk_details [4.348573s] ... ok\n{2} neutron_tempest_plugin.api.test_trunk_details.TestTrunkDetailsJSON.test_port_resource_trunk_details_no_subports [5.479316s] ... ok\n{2} neutron_tempest_plugin.api.test_trunk_details.TestTrunkDetailsJSON.test_port_resource_trunk_details_with_subport [9.985204s] ... ok\n{3} neutron_tempest_plugin.scenario.test_floatingip.FloatingIPPortDetailsTest.test_floatingip_port_details [117.465087s] ... ok\n{0} neutron_tempest_plugin.scenario.admin.test_floatingip.FloatingIpTestCasesAdmin.test_two_vms_fips [82.426212s] ... ok\n{3} setUpClass (neutron_tempest_plugin.scenario.test_local_ip.LocalIPTest) ... SKIPPED: Skipped because network extension: local_ip is not enabled\n{0} setUpClass (neutron_tempest_plugin.scenario.test_dns_integration.DNSIntegrationAdminTests) ... 
SKIPPED: DNSIntegrationAdminTests skipped as designate is not available\n{0} setUpClass (neutron_tempest_plugin.scenario.test_dns_integration.DNSIntegrationExtraTests) ... SKIPPED: DNSIntegrationExtraTests skipped as designate is not available\n{1} neutron_tempest_plugin.scenario.test_floatingip.FloatingIpSeparateNetwork.test_east_west_1 [83.416657s] ... ok\n{2} neutron_tempest_plugin.scenario.test_basic.NetworkBasicTest.test_basic_instance [82.653480s] ... ok\n{2} neutron_tempest_plugin.scenario.test_basic.NetworkBasicTest.test_ping_global_ip_from_vm_with_fip ... SKIPPED: Global IP address is not defined.\n{2} neutron_tempest_plugin.scenario.test_connectivity.NetworkConnectivityTest.test_connectivity_dvr_and_no_dvr_routers_in_same_subnet ... SKIPPED: Skipped because network extension: dvr is not enabled\n{1} neutron_tempest_plugin.scenario.test_floatingip.FloatingIpSeparateNetwork.test_east_west_2 [89.051591s] ... ok\n{0} neutron_tempest_plugin.scenario.test_floatingip.FloatingIpMultipleRoutersTest.test_reuse_ip_address_with_other_fip_on_other_router [153.557620s] ... ok\n{3} neutron_tempest_plugin.scenario.test_qos.QoSTest.test_attach_previously_used_port_to_new_instance [163.189172s] ... ok\n{0} setUpClass (neutron_tempest_plugin.scenario.test_mac_learning.MacLearningTest) ... SKIPPED: This test requires advanced tools to be executed\n{0} setUpClass (neutron_tempest_plugin.scenario.test_migration.NetworkMigrationFromDVR) ... SKIPPED: Skipped because network extension: dvr is not enabled\n{0} setUpClass (neutron_tempest_plugin.scenario.test_migration.NetworkMigrationFromHA) ... SKIPPED: Skipped because network extension: dvr is not enabled\n{2} neutron_tempest_plugin.scenario.test_connectivity.NetworkConnectivityTest.test_connectivity_router_east_west_traffic [120.918807s] ... ok\n{1} neutron_tempest_plugin.scenario.test_floatingip.FloatingIpSeparateNetwork.test_east_west_3 [123.110325s] ... 
ok\n{3} neutron_tempest_plugin.scenario.test_qos.QoSTest.test_create_instance_using_network_with_existing_policy [99.126227s] ... ok\n{2} neutron_tempest_plugin.scenario.test_connectivity.NetworkConnectivityTest.test_connectivity_through_2_routers [99.471004s] ... ok\n{1} neutron_tempest_plugin.scenario.test_floatingip.FloatingIpSeparateNetwork.test_east_west_4 [79.073514s] ... ok\n{2} neutron_tempest_plugin.scenario.test_dhcp.DHCPPortUpdateTest.test_modify_dhcp_port_ip_address ... SKIPPED: OVN driver is required to run this test - LP#1942794 solution only applied to OVN\n{3} neutron_tempest_plugin.scenario.test_qos.QoSTest.test_qos_basic_and_update [118.375390s] ... ok\n{2} neutron_tempest_plugin.scenario.test_dhcp.DHCPTest.test_extra_dhcp_opts [68.980035s] ... ok\n{2} setUpClass (neutron_tempest_plugin.scenario.test_dns_integration.DNSIntegrationDomainPerProjectTests) ... SKIPPED: DNSIntegrationDomainPerProjectTests skipped as designate is not available\n{1} neutron_tempest_plugin.scenario.test_ipv6.IPv6Test.test_ipv6_hotplug_dhcpv6stateless [80.165040s] ... ok\n{1} neutron_tempest_plugin.scenario.test_ipv6.IPv6Test.test_ipv6_hotplug_slaac [79.753879s] ... ok\n{2} neutron_tempest_plugin.scenario.test_floatingip.FloatingIPQosTest.test_qos [99.672187s] ... ok\n{1} neutron_tempest_plugin.scenario.test_metadata.MetadataTest.test_metadata_routed ... SKIPPED: Advanced image is required to run this test.\n{1} setUpClass (neutron_tempest_plugin.scenario.test_migration.NetworkMigrationFromDVRHA) ... SKIPPED: Skipped because network extension: dvr is not enabled\n{1} setUpClass (neutron_tempest_plugin.scenario.test_migration.NetworkMigrationFromLegacy) ... SKIPPED: Skipped because network extension: dvr is not enabled\n{1} setUpClass (neutron_tempest_plugin.scenario.test_mtu.NetworkMtuTest) ... SKIPPED: GRE or VXLAN type_driver is not enabled\n{2} neutron_tempest_plugin.scenario.test_floatingip.FloatingIpSameNetwork.test_east_west_1 [89.453921s] ... 
ok
{1} neutron_tempest_plugin.scenario.test_port_forwardings.PortForwardingTestJSON.test_port_forwarding_editing_and_deleting_tcp_rule [102.578268s] ... ok
{2} neutron_tempest_plugin.scenario.test_floatingip.FloatingIpSameNetwork.test_east_west_2 [92.066310s] ... ok
{1} neutron_tempest_plugin.scenario.test_port_forwardings.PortForwardingTestJSON.test_port_forwarding_editing_and_deleting_udp_rule [105.564427s] ... ok
{2} neutron_tempest_plugin.scenario.test_floatingip.FloatingIpSameNetwork.test_east_west_3 [140.558136s] ... ok
{1} neutron_tempest_plugin.scenario.test_port_forwardings.PortForwardingTestJSON.test_port_forwarding_to_2_fixed_ips [107.668195s] ... ok
{2} neutron_tempest_plugin.scenario.test_floatingip.FloatingIpSameNetwork.test_east_west_4 [74.839647s] ... ok
{1} neutron_tempest_plugin.scenario.test_port_forwardings.PortForwardingTestJSON.test_port_forwarding_to_2_servers [151.075935s] ... ok
{2} neutron_tempest_plugin.scenario.test_floatingip.TestFloatingIPUpdate.test_floating_ip_update [98.221068s] ... ok
{1} neutron_tempest_plugin.scenario.test_ports.PortsTest.test_port_with_fixed_ip [46.551135s] ... ok
{2} neutron_tempest_plugin.scenario.test_internal_dns.InternalDNSTest.test_create_and_update_port_with_dns_name [101.263655s] ... ok
{1} neutron_tempest_plugin.scenario.test_ports.PortsTest.test_previously_used_port [119.581036s] ... ok
{2} neutron_tempest_plugin.scenario.test_internal_dns.InternalDNSTest.test_dns_domain_and_name [177.965972s] ... ok
{2} setUpClass (neutron_tempest_plugin.scenario.test_multicast.MulticastTestIPv4) ... SKIPPED: This test require advanced tools for this test
{1} neutron_tempest_plugin.scenario.test_security_groups.StatelessSecGroupDualStackDHCPv6StatelessTest.test_default_sec_grp_scenarios [187.850287s] ... ok
{1} neutron_tempest_plugin.scenario.test_trunk.TrunkTest.test_parent_port_connectivity_after_trunk_deleted_lb ... SKIPPED: Advanced image is required to run this test.
{1} neutron_tempest_plugin.scenario.test_trunk.TrunkTest.test_subport_connectivity ... SKIPPED: Advanced image is required to run this test.
{1} neutron_tempest_plugin.scenario.test_trunk.TrunkTest.test_subport_connectivity_soft_reboot ... SKIPPED: Advanced image is required to run this test.
{2} neutron_tempest_plugin.scenario.test_portsecurity.PortSecurityTest.test_port_security_removed_added_stateful_sg [90.152993s] ... ok
{2} neutron_tempest_plugin.scenario.test_portsecurity.PortSecurityTest.test_port_security_removed_added_stateless_sg [98.417694s] ... ok
{1} neutron_tempest_plugin.scenario.test_trunk.TrunkTest.test_trunk_subport_lifecycle [172.978105s] ... ok
{1} neutron_tempest_plugin.scenario.test_trunk.TrunkTest.test_trunk_vm_migration ... SKIPPED: Less than 2 compute nodes, skipping multinode tests.
{1} setUpClass (neutron_tempest_plugin.scenario.test_vlan_transparency.VlanTransparencyTest) ... SKIPPED: vlan-transparent extension not enabled.
{2} neutron_tempest_plugin.scenario.test_security_groups.StatelessNetworkSecGroupIPv4Test.test_connectivity_between_vms_using_different_sec_groups [151.800634s] ... ok
{2} neutron_tempest_plugin.scenario.test_security_groups.StatelessNetworkSecGroupIPv4Test.test_default_sec_grp_scenarios [157.572479s] ... ok
{2} neutron_tempest_plugin.scenario.test_security_groups.StatelessNetworkSecGroupIPv4Test.test_fragmented_traffic_is_accepted ... SKIPPED: Advanced image is required to run this test.
{2} neutron_tempest_plugin.scenario.test_security_groups.StatelessNetworkSecGroupIPv4Test.test_ip_prefix [90.172081s] ... ok
{2} neutron_tempest_plugin.scenario.test_security_groups.StatelessNetworkSecGroupIPv4Test.test_ip_prefix_negative [140.796251s] ... ok
{2} neutron_tempest_plugin.scenario.test_security_groups.StatelessNetworkSecGroupIPv4Test.test_multiple_ports_portrange_remote [232.085843s] ... ok
{2} neutron_tempest_plugin.scenario.test_security_groups.StatelessNetworkSecGroupIPv4Test.test_multiple_ports_secgroup_inheritance [76.417326s] ... ok
{2} neutron_tempest_plugin.scenario.test_security_groups.StatelessNetworkSecGroupIPv4Test.test_overlapping_sec_grp_rules [250.677933s] ... ok
{2} neutron_tempest_plugin.scenario.test_security_groups.StatelessNetworkSecGroupIPv4Test.test_packets_of_any_connection_state_can_reach_dest ... SKIPPED: Advanced image is required to run this test.
{2} neutron_tempest_plugin.scenario.test_security_groups.StatelessNetworkSecGroupIPv4Test.test_protocol_number_rule [112.017830s] ... ok
{2} neutron_tempest_plugin.scenario.test_security_groups.StatelessNetworkSecGroupIPv4Test.test_reattach_sg_with_changed_mode [44.339067s] ... ok
{2} neutron_tempest_plugin.scenario.test_security_groups.StatelessNetworkSecGroupIPv4Test.test_remote_group [141.707074s] ... ok
{2} neutron_tempest_plugin.scenario.test_security_groups.StatelessNetworkSecGroupIPv4Test.test_remote_group_and_remote_address_group ... SKIPPED: Openvswitch agent is required to run this test
{2} neutron_tempest_plugin.scenario.test_security_groups.StatelessNetworkSecGroupIPv4Test.test_remove_sec_grp_from_active_vm [110.616211s] ... ok
{2} neutron_tempest_plugin.scenario.test_security_groups.StatelessNetworkSecGroupIPv4Test.test_two_sec_groups [44.400442s] ... ok
{2} neutron_tempest_plugin.scenario.test_security_groups.StatelessSecGroupDualStackSlaacTest.test_default_sec_grp_scenarios [152.197966s] ...
ok

==============================
Failed 1 tests - output below:
==============================

setUpClass (neutron_tempest_plugin.api.test_trunk_negative.TrunkTestJSON)
-------------------------------------------------------------------------

Captured traceback:
~~~~~~~~~~~~~~~~~~~
    Traceback (most recent call last):
      File "/usr/lib/python3.9/site-packages/tempest/test.py", line 185, in setUpClass
        raise value.with_traceback(trace)
      File "/usr/lib/python3.9/site-packages/tempest/test.py", line 170, in setUpClass
        cls.setup_credentials()
      File "/usr/lib/python3.9/site-packages/neutron_tempest_plugin/api/base.py", line 117, in setup_credentials
        super(BaseNetworkTest, cls).setup_credentials()
      File "/usr/lib/python3.9/site-packages/tempest/test.py", line 398, in setup_credentials
        manager = cls.get_client_manager(
      File "/usr/lib/python3.9/site-packages/neutron_tempest_plugin/api/base.py", line 89, in get_client_manager
        manager = super(BaseNetworkTest, cls).get_client_manager(
      File "/usr/lib/python3.9/site-packages/tempest/test.py", line 743, in get_client_manager
        creds = getattr(cred_provider, credentials_method)()
      File "/usr/lib/python3.9/site-packages/tempest/lib/common/dynamic_creds.py", line 473, in get_primary_creds
        return self.get_project_member_creds()
      File "/usr/lib/python3.9/site-packages/tempest/lib/common/dynamic_creds.py", line 514, in get_project_member_creds
        return self.get_credentials(['member'], scope='project')
      File "/usr/lib/python3.9/site-packages/tempest/lib/common/dynamic_creds.py", line 436, in get_credentials
        credentials = self._create_creds(
      File "/usr/lib/python3.9/site-packages/tempest/lib/common/dynamic_creds.py", line 200, in _create_creds
        project = self.creds_client.create_project(
      File "/usr/lib/python3.9/site-packages/tempest/lib/common/cred_client.py", line 164, in create_project
        project = self.projects_client.create_project(
      File "/usr/lib/python3.9/site-packages/tempest/lib/services/identity/v3/projects_client.py", line 37, in create_project
        resp, body = self.post('projects', post_body)
      File "/usr/lib/python3.9/site-packages/tempest/lib/common/rest_client.py", line 314, in post
        resp_header, resp_body = self.request(
      File "/usr/lib/python3.9/site-packages/tempest/lib/common/rest_client.py", line 762, in request
        self._error_checker(resp, resp_body)
      File "/usr/lib/python3.9/site-packages/tempest/lib/common/rest_client.py", line 946, in _error_checker
        raise exceptions.UnexpectedResponseCode(str(resp.status),
    tempest.lib.exceptions.UnexpectedResponseCode: Unexpected response code received
    Details: 504

======
Totals
======
Ran: 696 tests in 4415.2383 sec.
 - Passed: 630
 - Skipped: 65
 - Expected Fail: 0
 - Unexpected Success: 0
 - Failed: 1
Sum of execute time for each test: 6939.5956 sec.

==============
Worker Balance
==============
 - Worker 0 (165 tests) => 0:18:13.217142
 - Worker 1 (161 tests) => 0:44:37.632479
 - Worker 2 (228 tests) => 1:13:35.024311
 - Worker 3 (142 tests) => 0:21:48.839706
~ /
/
Excluded tests
Included tests
~/openshift /
Generate file containing failing tests
Generate subunit, then xml and html results
setUpClass (neutron_tempest_plugin.api.test_trunk_negative.TrunkTestJSON)
/
stdout_lines:
- python3-tempest-41.0.0-0.20250124132801.a25e0df.el9.noarch
- python3-tempestconf-3.5.3-0.20250819134715.8515371.el9.noarch
- openstack-tempest-41.0.0-0.20250124132801.a25e0df.el9.noarch
- python3-watcher-tests-tempest-3.0.0-0.20240131100157.92ca984.el9.noarch
- python3-designate-tests-tempest-0.22.0-0.20240409063647.347fdbc.el9.noarch
- python3-manila-tests-tempest-2.4.0-0.20240730171324.d9530e0.el9.noarch
- python3-keystone-tests-tempest-0.16.0-0.20240528071825.63cfcb9.el9.noarch
- python3-whitebox-tests-tempest-0.0.3-0.20240412161827.766ff04.el9.noarch
-
python3-murano-tests-tempest-2.7.0-0.20240131092708.d2b794c.el9.noarch
- python3-trove-tests-tempest-2.2.0-0.20240131093157.d63e17a.el9.noarch
- python3-mistral-tests-tempest-2.2.0-0.20240131094539.2f92367.el9.noarch
- python3-kuryr-tests-tempest-0.15.1-0.20240131095631.ab45b2f.el9.noarch
- python3-whitebox-neutron-tests-tempest-0.9.2-0.20251111185731.12cf06c.el9.noarch
- python3-zaqar-tests-tempest-1.7.0-0.20240131094344.3813c99.el9.noarch
- python3-magnum-tests-tempest-2.1.0-0.20240131093411.ef90336.el9.noarch
- python3-octavia-tests-tempest-golang-2.6.0-0.20240409063333.a1a2bed.el9.x86_64
- python3-octavia-tests-tempest-2.6.0-0.20240409063333.a1a2bed.el9.noarch
- python3-glance-tests-tempest-0.7.0-0.20240131091807.d6f7287.el9.noarch
- python3-heat-tests-tempest-2.1.0-0.20240409061406.5a48492.el9.noarch
- python3-telemetry-tests-tempest-2.5.1-0.20250603080835.ddfb79a.el9.noarch
- python3-neutron-tests-tempest-2.7.0-0.20240409063927.bcabf13.el9.noarch
- python3-networking-l2gw-tests-tempest-0.1.1-0.20230315174804.82e3d07.el9.noarch
- python3-sahara-tempest-0.16.0-0.20230314174536.98063d3.el9.noarch
- python3-sahara-tests-tempest-0.16.0-0.20230314174536.98063d3.el9.noarch
- python3-vitrage-tests-tempest-6.2.0-0.20240131094852.816b235.el9.noarch
- python3-cinder-tests-tempest-1.15.0-0.20240924072752.645067a.el9.noarch
- python3-ironic-tests-tempest-2.11.0-0.20241002133254.fd8163d.el9.noarch
- python3-barbican-tests-tempest-4.0.0-0.20240409062212.82b0e48.el9.noarch
- openstack-tempest-all-41.0.0-0.20250124132801.a25e0df.el9.noarch
- ~ /
- "2026-01-22 16:53:18.295 9 INFO tempest [-] Using tempest config file /etc/tempest/tempest.conf\e[00m"
- "2026-01-22 16:53:18.339 9 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: barbican_tests\e[00m"
- "2026-01-22 16:53:18.339 9 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: cinder_tests\e[00m"
- "2026-01-22 16:53:18.339 9 INFO 
tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: designate\e[00m" - "2026-01-22 16:53:18.340 9 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: glance_tests\e[00m" - "2026-01-22 16:53:18.340 9 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: heat\e[00m" - "2026-01-22 16:53:18.340 9 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: ironic_tests\e[00m" - "2026-01-22 16:53:18.341 9 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: keystone_tests\e[00m" - "2026-01-22 16:53:18.341 9 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: kuryr_tempest_tests\e[00m" - "2026-01-22 16:53:18.341 9 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: magnum_tests\e[00m" - "2026-01-22 16:53:18.341 9 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: manila_tests\e[00m" - "2026-01-22 16:53:18.342 9 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: mistral_test\e[00m" - "2026-01-22 16:53:18.342 9 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: murano_tests\e[00m" - "2026-01-22 16:53:18.342 9 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: networking_l2gw_tempest_plugin\e[00m" - "2026-01-22 16:53:18.343 9 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: neutron_tests\e[00m" - "2026-01-22 16:53:18.343 9 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: octavia-tempest-plugin\e[00m" - "2026-01-22 16:53:18.343 9 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: 
sahara_tempest_tests\e[00m" - "2026-01-22 16:53:18.343 9 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: telemetry_tests\e[00m" - "2026-01-22 16:53:18.344 9 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: trove_tests\e[00m" - "2026-01-22 16:53:18.344 9 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: vitrage_tests\e[00m" - "2026-01-22 16:53:18.344 9 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: watcher_tests\e[00m" - "2026-01-22 16:53:18.344 9 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: whitebox-neutron-tempest-plugin\e[00m" - "2026-01-22 16:53:18.344 9 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: whitebox-tempest-plugin\e[00m" - "2026-01-22 16:53:18.345 9 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: zaqar_tests\e[00m" - "2026-01-22 16:53:18.345 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: barbican_tests\e[00m" - "2026-01-22 16:53:18.345 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: cinder_tests\e[00m" - "2026-01-22 16:53:18.345 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: designate\e[00m" - "2026-01-22 16:53:18.345 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: glance_tests\e[00m" - "2026-01-22 16:53:18.346 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: heat\e[00m" - "2026-01-22 16:53:18.346 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: ironic_tests\e[00m" - "2026-01-22 16:53:18.346 9 INFO 
tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: keystone_tests\e[00m" - "2026-01-22 16:53:18.346 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: kuryr_tempest_tests\e[00m" - "2026-01-22 16:53:18.346 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: magnum_tests\e[00m" - "2026-01-22 16:53:18.346 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: manila_tests\e[00m" - "2026-01-22 16:53:18.346 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: mistral_test\e[00m" - "2026-01-22 16:53:18.346 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: murano_tests\e[00m" - "2026-01-22 16:53:18.347 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: networking_l2gw_tempest_plugin\e[00m" - "2026-01-22 16:53:18.347 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: neutron_tests\e[00m" - "2026-01-22 16:53:18.347 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: octavia-tempest-plugin\e[00m" - "2026-01-22 16:53:18.347 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: sahara_tempest_tests\e[00m" - "2026-01-22 16:53:18.347 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: telemetry_tests\e[00m" - "2026-01-22 16:53:18.347 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: trove_tests\e[00m" - "2026-01-22 16:53:18.347 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: vitrage_tests\e[00m" - "2026-01-22 16:53:18.348 9 INFO 
tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: watcher_tests\e[00m" - "2026-01-22 16:53:18.348 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: whitebox-neutron-tempest-plugin\e[00m" - "2026-01-22 16:53:18.348 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: whitebox-tempest-plugin\e[00m" - "2026-01-22 16:53:18.348 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: zaqar_tests\e[00m" - "2026-01-22 16:53:18.370 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: barbican_tests\e[00m" - "2026-01-22 16:53:18.370 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: cinder_tests\e[00m" - "2026-01-22 16:53:18.370 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: designate\e[00m" - "2026-01-22 16:53:18.370 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: glance_tests\e[00m" - "2026-01-22 16:53:18.370 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: heat\e[00m" - "2026-01-22 16:53:18.370 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: ironic_tests\e[00m" - "2026-01-22 16:53:18.370 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: keystone_tests\e[00m" - "2026-01-22 16:53:18.370 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: kuryr_tempest_tests\e[00m" - "2026-01-22 16:53:18.370 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: magnum_tests\e[00m" - "2026-01-22 16:53:18.371 9 INFO tempest.test_discover.plugins 
[-] List additional config options registered by Tempest plugin: manila_tests\e[00m" - "2026-01-22 16:53:18.371 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: mistral_test\e[00m" - "2026-01-22 16:53:18.371 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: murano_tests\e[00m" - "2026-01-22 16:53:18.371 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: networking_l2gw_tempest_plugin\e[00m" - "2026-01-22 16:53:18.371 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: neutron_tests\e[00m" - "2026-01-22 16:53:18.371 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: octavia-tempest-plugin\e[00m" - "2026-01-22 16:53:18.371 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: sahara_tempest_tests\e[00m" - "2026-01-22 16:53:18.371 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: telemetry_tests\e[00m" - "2026-01-22 16:53:18.371 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: trove_tests\e[00m" - "2026-01-22 16:53:18.371 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: vitrage_tests\e[00m" - "2026-01-22 16:53:18.372 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: watcher_tests\e[00m" - "2026-01-22 16:53:18.372 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: whitebox-neutron-tempest-plugin\e[00m" - "2026-01-22 16:53:18.372 9 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: whitebox-tempest-plugin\e[00m" - "2026-01-22 16:53:18.372 9 INFO 
tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: zaqar_tests\e[00m" - "2026-01-22 16:53:18.382 9 WARNING oslo_config.generator [-] \"enabled_datastores\" is missing a help string\e[00m" - "2026-01-22 16:53:18.395 9 WARNING oslo_config.generator [-] \"zabbix_alarms_per_host\" is missing a help string\e[00m" - ~/openshift ~ / - "2026-01-22 16:53:19.241 13 INFO tempest [-] Using tempest config file /var/lib/tempest/openshift/etc/tempest.conf\e[00m" - "2026-01-22 16:53:19.317 13 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: barbican_tests\e[00m" - "2026-01-22 16:53:19.318 13 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: cinder_tests\e[00m" - "2026-01-22 16:53:19.318 13 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: designate\e[00m" - "2026-01-22 16:53:19.318 13 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: glance_tests\e[00m" - "2026-01-22 16:53:19.318 13 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: heat\e[00m" - "2026-01-22 16:53:19.318 13 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: ironic_tests\e[00m" - "2026-01-22 16:53:19.319 13 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: keystone_tests\e[00m" - "2026-01-22 16:53:19.319 13 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: kuryr_tempest_tests\e[00m" - "2026-01-22 16:53:19.319 13 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: magnum_tests\e[00m" - "2026-01-22 16:53:19.320 13 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: manila_tests\e[00m" - "2026-01-22 16:53:19.320 13 INFO 
tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: mistral_test\e[00m" - "2026-01-22 16:53:19.320 13 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: murano_tests\e[00m" - "2026-01-22 16:53:19.320 13 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: networking_l2gw_tempest_plugin\e[00m" - "2026-01-22 16:53:19.320 13 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: neutron_tests\e[00m" - "2026-01-22 16:53:19.321 13 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: octavia-tempest-plugin\e[00m" - "2026-01-22 16:53:19.321 13 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: sahara_tempest_tests\e[00m" - "2026-01-22 16:53:19.321 13 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: telemetry_tests\e[00m" - "2026-01-22 16:53:19.321 13 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: trove_tests\e[00m" - "2026-01-22 16:53:19.322 13 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: vitrage_tests\e[00m" - "2026-01-22 16:53:19.322 13 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: watcher_tests\e[00m" - "2026-01-22 16:53:19.322 13 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: whitebox-neutron-tempest-plugin\e[00m" - "2026-01-22 16:53:19.322 13 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: whitebox-tempest-plugin\e[00m" - "2026-01-22 16:53:19.322 13 INFO tempest.test_discover.plugins [-] Register additional config options from Tempest plugin: zaqar_tests\e[00m" - "2026-01-22 16:53:19.323 13 INFO tempest.test_discover.plugins [-] List 
additional config options registered by Tempest plugin: barbican_tests\e[00m" - "2026-01-22 16:53:19.323 13 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: cinder_tests\e[00m" - "2026-01-22 16:53:19.324 13 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: designate\e[00m" - "2026-01-22 16:53:19.324 13 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: glance_tests\e[00m" - "2026-01-22 16:53:19.324 13 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: heat\e[00m" - "2026-01-22 16:53:19.324 13 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: ironic_tests\e[00m" - "2026-01-22 16:53:19.324 13 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: keystone_tests\e[00m" - "2026-01-22 16:53:19.324 13 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: kuryr_tempest_tests\e[00m" - "2026-01-22 16:53:19.324 13 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: magnum_tests\e[00m" - "2026-01-22 16:53:19.324 13 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: manila_tests\e[00m" - "2026-01-22 16:53:19.324 13 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: mistral_test\e[00m" - "2026-01-22 16:53:19.325 13 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: murano_tests\e[00m" - "2026-01-22 16:53:19.325 13 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: networking_l2gw_tempest_plugin\e[00m" - "2026-01-22 16:53:19.325 13 INFO tempest.test_discover.plugins [-] List additional config options 
registered by Tempest plugin: neutron_tests\e[00m" - "2026-01-22 16:53:19.325 13 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: octavia-tempest-plugin\e[00m" - "2026-01-22 16:53:19.325 13 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: sahara_tempest_tests\e[00m" - "2026-01-22 16:53:19.325 13 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: telemetry_tests\e[00m" - "2026-01-22 16:53:19.325 13 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: trove_tests\e[00m" - "2026-01-22 16:53:19.325 13 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: vitrage_tests\e[00m" - "2026-01-22 16:53:19.325 13 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: watcher_tests\e[00m" - "2026-01-22 16:53:19.325 13 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: whitebox-neutron-tempest-plugin\e[00m" - "2026-01-22 16:53:19.325 13 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: whitebox-tempest-plugin\e[00m" - "2026-01-22 16:53:19.326 13 INFO tempest.test_discover.plugins [-] List additional config options registered by Tempest plugin: zaqar_tests\e[00m" - "2026-01-22 16:53:19.350 13 DEBUG config_tempest.constants [-] Setting basic default values load_basic_defaults /usr/lib/python3.9/site-packages/config_tempest/main.py:80\e[00m" - "2026-01-22 16:53:19.350 13 DEBUG config_tempest.constants [-] Setting [DEFAULT] debug = true set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105\e[00m" - "2026-01-22 16:53:19.350 13 DEBUG config_tempest.constants [-] Setting [DEFAULT] use_stderr = false set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105\e[00m" - 
"2026-01-22 16:53:19.350 13 DEBUG config_tempest.constants [-] Setting [DEFAULT] log_file = tempest.log set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105\e[00m" - "2026-01-22 16:53:19.351 13 DEBUG config_tempest.constants [-] Setting [identity] username = demo_tempestconf set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105\e[00m" - "2026-01-22 16:53:19.351 13 DEBUG config_tempest.constants [-] Setting [identity] password = se**********te set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105\e[00m" - "2026-01-22 16:53:19.351 13 DEBUG config_tempest.constants [-] Setting [identity] project_name = demo set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105\e[00m" - "2026-01-22 16:53:19.351 13 DEBUG config_tempest.constants [-] Setting [identity] project_domain_name = Default set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105\e[00m" - "2026-01-22 16:53:19.351 13 DEBUG config_tempest.constants [-] Setting [identity] user_domain_name = Default set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105\e[00m" - "2026-01-22 16:53:19.351 13 DEBUG config_tempest.constants [-] Setting [identity] alt_username = alt_demo_tempestconf set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105\e[00m" - "2026-01-22 16:53:19.351 13 DEBUG config_tempest.constants [-] Setting [identity] alt_password = se**********te set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105\e[00m" - "2026-01-22 16:53:19.352 13 DEBUG config_tempest.constants [-] Setting [identity] alt_project_name = alt_demo set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105\e[00m" - "2026-01-22 16:53:19.352 13 DEBUG config_tempest.constants [-] Setting [auth] tempest_roles = member set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105\e[00m" - "2026-01-22 16:53:19.352 13 DEBUG config_tempest.constants [-] Setting [auth] admin_username = admin 
set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105\e[00m" - "2026-01-22 16:53:19.352 13 DEBUG config_tempest.constants [-] Setting [auth] admin_project_name = admin set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105\e[00m" - "2026-01-22 16:53:19.352 13 DEBUG config_tempest.constants [-] Setting [auth] admin_domain_name = Default set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105\e[00m" - "2026-01-22 16:53:19.353 13 DEBUG config_tempest.constants [-] Setting [auth] admin_project_domain_name = Default set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105\e[00m" - "2026-01-22 16:53:19.353 13 DEBUG config_tempest.constants [-] Setting [auth] admin_user_domain_name = Default set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105\e[00m" - "2026-01-22 16:53:19.353 13 DEBUG config_tempest.constants [-] Setting [object-storage] reseller_admin_role = ResellerAdmin set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105\e[00m" - "2026-01-22 16:53:19.353 13 DEBUG config_tempest.constants [-] Setting [oslo-concurrency] lock_path = /tmp set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105\e[00m" - "2026-01-22 16:53:19.353 13 DEBUG config_tempest.constants [-] Setting [compute-feature-enabled] preserve_ports = true set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105\e[00m" - "2026-01-22 16:53:19.354 13 DEBUG config_tempest.constants [-] Setting [network-feature-enabled] ipv6_subnet_attributes = true set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105\e[00m" - "2026-01-22 16:53:19.354 13 DEBUG config_tempest.constants [-] Setting [scenario] dhcp_client = dhcpcd set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105\e[00m" - "2026-01-22 16:53:19.354 13 DEBUG config_tempest.constants [-] Setting [image] image_path = https://download.cirros-cloud.net/0.6.2/cirros-0.6.2-x86_64-disk.img set 
/usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:19.354 13 DEBUG config_tempest.constants [-] Setting [auth] admin_username = admin set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:19.354 13 DEBUG config_tempest.constants [-] Setting [auth] admin_project_name = admin set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:19.354 13 DEBUG config_tempest.constants [-] Setting [auth] admin_password = 12345678 set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:19.354 13 DEBUG config_tempest.constants [-] Setting [auth] admin_project_domain_name = Default set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:19.355 13 DEBUG config_tempest.constants [-] Setting [auth] admin_user_domain_name = Default set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:19.355 13 DEBUG config_tempest.constants [-] Setting [identity] uri = https://keystone-public-openstack.apps-crc.testing set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:19.355 13 DEBUG config_tempest.constants [-] Setting [identity] region = regionOne set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:19.355 13 DEBUG config_tempest.constants [-] Setting [compute-feature-enabled] dhcp_domain = set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:19.355 13 DEBUG config_tempest.constants [-] Setting [compute-feature-enabled] vnc_console = True set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:19.355 13 DEBUG config_tempest.constants [-] Setting [identity] v2_admin_endpoint_type = public set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:19.355 13 DEBUG config_tempest.constants [-] Setting [identity] v3_endpoint_type = public set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:19.355 13 DEBUG config_tempest.constants [-] Setting [validation] run_validation = True set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:19.355 13 DEBUG config_tempest.constants [-] Setting [volume] catalog_type = volumev3 set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:19.356 13 DEBUG config_tempest.constants [-] Setting [identity] uri_v3 = https://keystone-public-openstack.apps-crc.testing set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:19.388 13 DEBUG config_tempest.constants [-] Setting [identity] disable_ssl_certificate_validation = False set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:19.388 13 DEBUG config_tempest.constants [-] Setting [identity] uri_v3 = https://keystone-public-openstack.apps-crc.testing/v3 set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:19.710 13 INFO tempest.lib.common.rest_client [req-bbd86b24-5437-425d-ae10-0806b4ea6b90 req-bbd86b24-5437-425d-ae10-0806b4ea6b90 ] Request (main): 201 POST https://keystone-public-openstack.apps-crc.testing/v3/auth/tokens 0.321s"
- "2026-01-22 16:53:19.768 13 INFO tempest.lib.common.rest_client [req-4042fd7b-aa4f-40f0-85e7-ed950c718f11 req-4042fd7b-aa4f-40f0-85e7-ed950c718f11 ] Request (main): 200 GET https://keystone-public-openstack.apps-crc.testing/v3/projects 0.056s"
- "2026-01-22 16:53:19.769 13 DEBUG config_tempest.constants [-] Setting [auth] admin_project_id = 8d88b00a23ef40338653b967006abf05 set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:19.769 13 INFO config_tempest.constants [-] Creating user 'demo_tempestconf' with project 'demo' and password 'secrete'"
- "2026-01-22 16:53:20.796 13 INFO tempest.lib.common.rest_client [req-c0f26dc3-f0bd-4e54-94c3-73bd36eca433 req-c0f26dc3-f0bd-4e54-94c3-73bd36eca433 ] Request (main): 201 POST https://keystone-public-openstack.apps-crc.testing/v3/projects 1.023s"
- "2026-01-22 16:53:20.911 13 INFO tempest.lib.common.rest_client [req-79451834-f1ee-4308-bbb2-c0df84ac9126 req-79451834-f1ee-4308-bbb2-c0df84ac9126 ] Request (main): 200 GET https://keystone-public-openstack.apps-crc.testing/v3/projects 0.114s"
- "2026-01-22 16:53:21.894 13 INFO tempest.lib.common.rest_client [req-98bc3fcb-4c9a-4f6d-9875-67da909f0180 req-98bc3fcb-4c9a-4f6d-9875-67da909f0180 ] Request (main): 201 POST https://keystone-public-openstack.apps-crc.testing/v3/users 0.983s"
- "2026-01-22 16:53:21.894 13 INFO config_tempest.constants [-] Creating user 'alt_demo_tempestconf' with project 'alt_demo' and password 'secrete'"
- "2026-01-22 16:53:21.972 13 INFO tempest.lib.common.rest_client [req-e33400e4-3bc7-436d-af30-caabc20ab7bd req-e33400e4-3bc7-436d-af30-caabc20ab7bd ] Request (main): 201 POST https://keystone-public-openstack.apps-crc.testing/v3/projects 0.077s"
- "2026-01-22 16:53:22.034 13 INFO tempest.lib.common.rest_client [req-08dbf2ac-f9ab-4a02-961b-577fad5a674b req-08dbf2ac-f9ab-4a02-961b-577fad5a674b ] Request (main): 200 GET https://keystone-public-openstack.apps-crc.testing/v3/projects 0.061s"
- "2026-01-22 16:53:22.371 13 INFO tempest.lib.common.rest_client [req-6d19e5ea-fbd8-48af-936c-b1b73a0b73da req-6d19e5ea-fbd8-48af-936c-b1b73a0b73da ] Request (main): 201 POST https://keystone-public-openstack.apps-crc.testing/v3/users 0.337s"
- "2026-01-22 16:53:22.407 13 INFO tempest.lib.common.rest_client [req-dd3665c0-f564-4bd1-9536-3f4311d87272 req-dd3665c0-f564-4bd1-9536-3f4311d87272 ] Request (main): 200 GET https://keystone-public-openstack.apps-crc.testing/v3/projects 0.035s"
- "2026-01-22 16:53:22.441 13 INFO tempest.lib.common.rest_client [req-efda4e24-f60c-4b72-b1b7-9df886474fbc req-efda4e24-f60c-4b72-b1b7-9df886474fbc ] Request (main): 200 GET https://keystone-public-openstack.apps-crc.testing/v3/users 0.033s"
- "2026-01-22 16:53:22.496 13 INFO tempest.lib.common.rest_client [req-aa10264e-1800-4e2e-8091-89ae18a145f7 req-aa10264e-1800-4e2e-8091-89ae18a145f7 ] Request (main): 200 GET https://keystone-public-openstack.apps-crc.testing/v3/roles 0.055s"
- "2026-01-22 16:53:22.550 13 INFO tempest.lib.common.rest_client [req-63793445-b3db-48de-ae48-19cf44769668 req-63793445-b3db-48de-ae48-19cf44769668 ] Request (main): 204 PUT https://keystone-public-openstack.apps-crc.testing/v3/projects/3693723342564b83802fc743f8353811/users/67f0771030f34f5ca522a5a02cbb774e/roles/9593ff08a65b46ceac5b0ac1558f2f3f 0.053s"
- "2026-01-22 16:53:22.550 13 DEBUG config_tempest.constants [-] User 'admin' was given the 'admin' role in project 'demo' give_role_to_user /usr/lib/python3.9/site-packages/config_tempest/users.py:106"
- "2026-01-22 16:53:22.623 13 INFO tempest.lib.common.rest_client [req-0b47e739-0aec-4d4a-a692-e6d648ab8613 req-0b47e739-0aec-4d4a-a692-e6d648ab8613 ] Request (main): 200 GET https://keystone-public-openstack.apps-crc.testing/v3/services 0.073s"
- "2026-01-22 16:53:23.225 13 DEBUG config_tempest.constants [-] Setting [service_available] aodh = False set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:23.225 13 DEBUG config_tempest.constants [-] Setting [service_available] ironic = False set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:23.226 13 DEBUG config_tempest.constants [-] Setting [service_available] ceilometer = False set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:23.253 13 DEBUG config_tempest.constants [-] Setting [compute-feature-enabled] console_output = True set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:23.511 13 INFO tempest.lib.common.rest_client [req-a3d752b3-1671-4084-82f4-7678f9018d52 req-a3d752b3-1671-4084-82f4-7678f9018d52 ] Request (main): 200 GET https://nova-public-openstack.apps-crc.testing/v2.1/os-hosts 0.257s"
- "2026-01-22 16:53:23.512 13 DEBUG config_tempest.constants [-] Setting [compute] min_compute_nodes = 1 set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:23.512 13 DEBUG config_tempest.constants [-] Setting [compute] min_microversion = 2.1 set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:23.512 13 DEBUG config_tempest.constants [-] Setting [compute] max_microversion = 2.95 set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:23.512 13 DEBUG config_tempest.constants [-] Setting [service_available] nova = True set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:23.512 13 DEBUG config_tempest.constants [-] Setting [service_available] sahara = False set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:23.512 13 DEBUG config_tempest.constants [-] Setting [service_available] trove = False set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:23.512 13 DEBUG config_tempest.constants [-] Setting [service_available] designate = False set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:23.512 13 DEBUG config_tempest.constants [-] Setting [service_available] panko = False set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:23.549 13 DEBUG config_tempest.constants [-] Setting [identity] auth_version = v3 set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:24.109 13 INFO tempest.lib.common.rest_client [req-539f3d6f-47ec-4925-98df-cecf0be7ddbf req-539f3d6f-47ec-4925-98df-cecf0be7ddbf ] Request (main): 200 GET https://glance-default-public-openstack.apps-crc.testing/v2/info/stores 0.518s"
- "2026-01-22 16:53:24.109 13 DEBUG config_tempest.constants [-] Setting [image-feature-enabled] import_image = True set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:24.109 13 DEBUG config_tempest.constants [-] Setting [validation] image_ssh_user = cirros set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:24.611 13 INFO tempest.lib.common.rest_client [req-0e3cb39b-a0ad-4b82-92a7-c4aba6bf099f req-0e3cb39b-a0ad-4b82-92a7-c4aba6bf099f ] Request (main): 200 GET https://glance-default-public-openstack.apps-crc.testing/v2/images 0.501s"
- "2026-01-22 16:53:24.612 13 DEBUG config_tempest.constants [-] Setting [image] http_image = https://download.cirros-cloud.net/0.6.2/cirros-0.6.2-x86_64-disk.img set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:24.612 13 DEBUG config_tempest.constants [-] Setting [image] http_qcow2_image = https://download.cirros-cloud.net/0.6.2/cirros-0.6.2-x86_64-disk.img set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:24.612 13 DEBUG config_tempest.constants [-] Setting [service_available] glance = True set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:24.612 13 DEBUG config_tempest.constants [-] Setting [service_available] barbican = True set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:24.612 13 DEBUG config_tempest.constants [-] Setting [service_available] zaqar = False set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:24.612 13 DEBUG config_tempest.constants [-] Setting [service_available] gnocchi = False set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:24.841 13 DEBUG config_tempest.constants [-] Setting [service_available] neutron = True set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:25.211 13 INFO tempest.lib.common.rest_client [tx2716d5cfc4ef4f7d8c823-0069725605 tx2716d5cfc4ef4f7d8c823-0069725605 ] Request (main): 200 GET https://swift-public-openstack.apps-crc.testing/healthcheck 0.036s"
- "2026-01-22 16:53:25.257 13 INFO tempest.lib.common.rest_client [req-14646571-1084-4fc2-9d0e-14e9f2589c61 req-14646571-1084-4fc2-9d0e-14e9f2589c61 ] Request (main): 200 GET https://keystone-public-openstack.apps-crc.testing/v3/roles 0.045s"
- "2026-01-22 16:53:25.258 13 INFO config_tempest.constants [-] Creating ResellerAdmin role"
- "2026-01-22 16:53:25.319 13 INFO tempest.lib.common.rest_client [req-90a783fd-ca3a-467e-b829-145248a8a155 req-90a783fd-ca3a-467e-b829-145248a8a155 ] Request (main): 201 POST https://keystone-public-openstack.apps-crc.testing/v3/roles 0.060s"
- "2026-01-22 16:53:25.319 13 DEBUG config_tempest.constants [-] Setting [object-storage] operator_role = admin set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:25.319 13 DEBUG config_tempest.constants [-] Setting [service_available] swift = True set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:25.319 13 DEBUG config_tempest.constants [-] Setting [service_available] octavia = False set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:25.319 13 DEBUG config_tempest.constants [-] Setting [service_available] heat = False set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:25.350 13 DEBUG config_tempest.constants [-] Setting [placement] min_microversion = 1.0 set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:25.350 13 DEBUG config_tempest.constants [-] Setting [placement] max_microversion = 1.39 set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:25.350 13 DEBUG config_tempest.constants [-] Setting [service_available] placement = True set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:25.351 13 DEBUG config_tempest.constants [-] Setting [service_available] manila = False set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:25.351 13 DEBUG config_tempest.constants [-] Setting [service_available] manila = False set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:25.351 13 DEBUG config_tempest.constants [-] Setting [service_available] ceilometer = False set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:25.581 13 DEBUG config_tempest.constants [-] Setting [volume] min_microversion = 3.0 set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:25.581 13 DEBUG config_tempest.constants [-] Setting [volume] max_microversion = 3.70 set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:25.889 13 DEBUG config_tempest.constants [-] Setting [service_available] cinder = True set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:25.889 13 DEBUG config_tempest.constants [-] Setting [service_available] watcher = False set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:25.889 13 DEBUG config_tempest.constants [-] Setting [service_available] mistral = False set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:25.889 13 DEBUG config_tempest.constants [-] Setting [service_available] mistral = False set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:26.061 13 INFO tempest.lib.common.rest_client [req-26e88561-dd79-4d43-a7c6-01c0dc841bc5 req-26e88561-dd79-4d43-a7c6-01c0dc841bc5 ] Request (main): 200 GET https://nova-public-openstack.apps-crc.testing/v2.1/flavors 0.171s"
- "2026-01-22 16:53:26.063 13 DEBUG config_tempest.constants [-] Setting [volume] volume_size = 1 set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:26.063 13 INFO config_tempest.constants [-] Creating flavor 'm1.nano'"
- "2026-01-22 16:53:26.279 13 INFO tempest.lib.common.rest_client [req-f2d1671f-d845-426f-8ddb-f95ddc7bb313 req-f2d1671f-d845-426f-8ddb-f95ddc7bb313 ] Request (main): 200 POST https://nova-public-openstack.apps-crc.testing/v2.1/flavors 0.216s"
- "2026-01-22 16:53:26.480 13 INFO tempest.lib.common.rest_client [req-9d7fdd7e-fd8e-4b7d-a965-4eba5745ac48 req-9d7fdd7e-fd8e-4b7d-a965-4eba5745ac48 ] Request (main): 200 POST https://nova-public-openstack.apps-crc.testing/v2.1/flavors/8d1ce660-7497-440b-8666-00c695d0b4d2/os-extra_specs 0.198s"
- "2026-01-22 16:53:26.482 13 DEBUG config_tempest.constants [-] Setting [compute] flavor_ref = 8d1ce660-7497-440b-8666-00c695d0b4d2 set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:26.482 13 INFO config_tempest.constants [-] Creating flavor 'm1.micro'"
- "2026-01-22 16:53:26.555 13 INFO tempest.lib.common.rest_client [req-ec4338c5-2f6f-4c35-ab33-4dc4572c7180 req-ec4338c5-2f6f-4c35-ab33-4dc4572c7180 ] Request (main): 200 POST https://nova-public-openstack.apps-crc.testing/v2.1/flavors 0.073s"
- "2026-01-22 16:53:26.637 13 INFO tempest.lib.common.rest_client [req-9902fef3-bc9c-4875-9465-35b1f336be35 req-9902fef3-bc9c-4875-9465-35b1f336be35 ] Request (main): 200 POST https://nova-public-openstack.apps-crc.testing/v2.1/flavors/c36c4338-67fc-4ac7-9a68-89ed828dd90b/os-extra_specs 0.078s"
- "2026-01-22 16:53:26.639 13 DEBUG config_tempest.constants [-] Setting [compute] flavor_ref_alt = c36c4338-67fc-4ac7-9a68-89ed828dd90b set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:27.217 13 INFO tempest.lib.common.rest_client [req-e518f2e7-50bf-424e-b9af-884f93676b01 req-e518f2e7-50bf-424e-b9af-884f93676b01 ] Request (main): 200 GET https://glance-default-public-openstack.apps-crc.testing/v2/images 0.578s"
- "2026-01-22 16:53:27.218 13 INFO config_tempest.constants [-] Creating image 'cirros-0.6.2-x86_64-disk.img'"
- "2026-01-22 16:53:27.218 13 INFO config_tempest.constants [-] Downloading 'https://download.cirros-cloud.net/0.6.2/cirros-0.6.2-x86_64-disk.img' and saving as '/var/lib/tempest/openshift/etc/cirros-0.6.2-x86_64-disk.img'"
- "2026-01-22 16:53:27.683 13 INFO config_tempest.constants [-] Uploading image 'cirros-0.6.2-x86_64-disk.img' from '/var/lib/tempest/openshift/etc/cirros-0.6.2-x86_64-disk.img'"
- "2026-01-22 16:53:28.186 13 INFO tempest.lib.common.rest_client [req-5d94114a-9da1-4961-84dd-ab351751780d req-5d94114a-9da1-4961-84dd-ab351751780d ] Request (main): 201 POST https://glance-default-public-openstack.apps-crc.testing/v2/images 0.502s"
- "2026-01-22 16:53:29.458 13 INFO tempest.lib.common.rest_client [req-c01f91ba-f79d-4b85-b080-84572f3ec74d req-c01f91ba-f79d-4b85-b080-84572f3ec74d ] Request (main): 204 PUT https://glance-default-public-openstack.apps-crc.testing/v2/images/e1b65bbe-5c14-4552-a5d9-d275c9dd42d3/file 1.271s"
- "2026-01-22 16:53:29.522 13 INFO tempest.lib.common.rest_client [req-3e340864-eed3-4055-b078-2836a990c73c req-3e340864-eed3-4055-b078-2836a990c73c ] Request (main): 200 GET https://glance-default-public-openstack.apps-crc.testing/v2/images 0.064s"
- "2026-01-22 16:53:29.523 13 INFO config_tempest.constants [-] Creating image 'cirros-0.6.2-x86_64-disk.img_alt'"
- "2026-01-22 16:53:29.523 13 INFO config_tempest.constants [-] Image 'https://download.cirros-cloud.net/0.6.2/cirros-0.6.2-x86_64-disk.img' already fetched to '/var/lib/tempest/openshift/etc/cirros-0.6.2-x86_64-disk.img'."
- "2026-01-22 16:53:29.523 13 INFO config_tempest.constants [-] Uploading image 'cirros-0.6.2-x86_64-disk.img_alt' from '/var/lib/tempest/openshift/etc/cirros-0.6.2-x86_64-disk.img'"
- "2026-01-22 16:53:29.650 13 INFO tempest.lib.common.rest_client [req-c97af78d-2299-418f-8eef-4db191378910 req-c97af78d-2299-418f-8eef-4db191378910 ] Request (main): 201 POST https://glance-default-public-openstack.apps-crc.testing/v2/images 0.127s"
- "2026-01-22 16:53:30.698 13 INFO tempest.lib.common.rest_client [req-cf88e403-6c67-4338-b823-b182193aca7c req-cf88e403-6c67-4338-b823-b182193aca7c ] Request (main): 204 PUT https://glance-default-public-openstack.apps-crc.testing/v2/images/a33c2bad-821b-43f1-aa77-518d2843bb18/file 1.048s"
- "2026-01-22 16:53:30.699 13 DEBUG config_tempest.constants [-] Setting [scenario] img_file = /var/lib/tempest/openshift/etc/cirros-0.6.2-x86_64-disk.img set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:30.699 13 DEBUG config_tempest.constants [-] Setting [compute] image_ref = e1b65bbe-5c14-4552-a5d9-d275c9dd42d3 set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:30.699 13 DEBUG config_tempest.constants [-] Setting [compute] image_ref_alt = a33c2bad-821b-43f1-aa77-518d2843bb18 set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:30.699 13 INFO config_tempest.constants [-] Setting up network"
- "2026-01-22 16:53:30.699 13 INFO config_tempest.constants [-] No network supplied, trying auto discover for an external network while prioritizing the one called public, if not found, the network discovered last will be used."
- "2026-01-22 16:53:31.347 13 INFO tempest.lib.common.rest_client [req-6b91ca0f-9aae-46d3-8d50-b7a76dacfdd6 req-6b91ca0f-9aae-46d3-8d50-b7a76dacfdd6 ] Request (main): 200 GET https://neutron-public-openstack.apps-crc.testing/v2.0/networks 0.648s"
- "2026-01-22 16:53:31.348 13 INFO config_tempest.constants [-] Setting 9663874c-fdcf-40bd-bc5f-873b4ba46792 as the public network for tempest"
- "2026-01-22 16:53:31.348 13 DEBUG config_tempest.constants [-] Setting [network] public_network_id = 9663874c-fdcf-40bd-bc5f-873b4ba46792 set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:31.348 13 DEBUG config_tempest.constants [-] Setting [network] floating_network_name = public set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:31.416 13 INFO tempest.lib.common.rest_client [req-b4bebe47-26d2-4136-b6bb-76786a557bb7 req-b4bebe47-26d2-4136-b6bb-76786a557bb7 ] Request (main): 200 GET https://cinder-public-openstack.apps-crc.testing/v3/os-services?binary=cinder-backup 0.068s"
- "2026-01-22 16:53:31.418 13 DEBUG config_tempest.constants [-] Setting [volume-feature-enabled] backup = False set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:31.436 13 DEBUG config_tempest.constants [-] Setting [service_available] horizon = False set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:31.437 13 DEBUG config_tempest.constants [-] Setting [identity-feature-enabled] api_v2 = False set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:31.437 13 DEBUG config_tempest.constants [-] Setting [identity-feature-enabled] api_v3 = True set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:31.437 13 DEBUG config_tempest.constants [-] Setting [image-feature-enabled] api_v1 = False set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:31.437 13 DEBUG config_tempest.constants [-] Setting [image-feature-enabled] api_v2 = True set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:31.462 13 DEBUG config_tempest.constants [-] Setting [identity-feature-enabled] api_extensions = OS-OAUTH2,OS-ENDPOINT-POLICY,OS-EC2,OS-SIMPLE-CERT,OS-PKI,OS-REVOKE,s3tokens,OS-FEDERATION,OS-INHERIT,OS-TRUST,OS-EP-FILTER,OS-OAUTH1 set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:31.462 13 DEBUG config_tempest.constants [-] Setting [network-feature-enabled] api_extensions = address-group,address-scope,agent,allowed-address-pairs,auto-allocated-topology,availability_zone,default-subnetpools,dhcp_agent_scheduler,dns-integration,dns-domain-ports,dns-integration-domain-keywords,expose-port-forwarding-in-fip,external-net,extra_dhcp_opt,extraroute,filter-validation,floating-ip-port-forwarding-description,floating-ip-port-forwarding-detail,floating-ip-port-forwarding-port-ranges,fip-port-details,flavors,floating-ip-port-forwarding,floatingip-pools,ip_allocation,l2_adjacency,router,ext-gw-mode,logging,multi-provider,net-mtu,net-mtu-writable,network_availability_zone,network-ip-availability,pagination,port-device-profile,port-mac-address-regenerate,port-numa-affinity-policy,port-resource-request,port-resource-request-groups,binding,binding-extended,port-security,project-id,provider,qos,qos-bw-limit-direction,qos-bw-minimum-ingress,qos-default,qos-fip,qos-gateway-ip,qos-port-network-policy,qos-pps-minimum,qos-pps-minimum-rule-alias,qos-pps,qos-rule-type-details,qos-rule-type-filter,qos-rules-alias,quota-check-limit,quotas,quota_details,rbac-policies,rbac-address-scope,rbac-security-groups,revision-if-match,standard-attr-revisions,router_availability_zone,security-groups-normalized-cidr,security-groups-remote-address-group,security-groups-shared-filtering,security-group,segment,segments-peer-subnet-host-routes,service-type,sorting,standard-attr-segment,standard-attr-description,stateful-security-group,subnet-dns-publish-fixed-ip,subnet-segmentid-writable,subnet-service-types,subnet_allocation,subnetpool-prefix-ops,standard-attr-tag,standard-attr-timestamp,trunk,trunk-details set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:31.462 13 DEBUG config_tempest.constants [-] Setting [object-storage-feature-enabled] discoverable_apis = symlink,versioned_writes,slo,account_quotas,container_quotas,staticweb,s3api,formpost,ratelimit,tempurl,bulk_upload,bulk_delete set
/usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:31.462 13 DEBUG config_tempest.constants [-] Setting [volume-feature-enabled] api_extensions = os-admin-actions,os-availability-zone,backups,capabilities,cgsnapshots,consistencygroups,os-extended-services,os-extended-snapshot-attributes,os-hosts,qos-specs,os-quota-class-sets,os-quota-sets,OS-SCH-HNT,scheduler-stats,os-services,os-snapshot-actions,os-snapshot-manage,os-snapshot-unmanage,os-types-extra-specs,os-types-manage,os-used-limits,os-volume-actions,os-volume-encryption-metadata,os-vol-host-attr,os-vol-image-meta,os-volume-manage,os-vol-mig-status-attr,os-vol-tenant-attr,os-volume-transfer,os-volume-type-access,encryption,os-volume-unmanage set /usr/lib/python3.9/site-packages/config_tempest/tempest_conf.py:105"
- "2026-01-22 16:53:31.462 13 INFO config_tempest.constants [-] Creating configuration file /var/lib/tempest/openshift/etc/tempest.conf"
- '{1} neutron_tempest_plugin.api.admin.test_logging.LoggingTestJSON.test_list_supported_logging_types [0.479288s] ... ok'
- '{1} neutron_tempest_plugin.api.admin.test_logging.LoggingTestJSON.test_log_deleted_with_corresponding_security_group [1.452320s] ... ok'
- '{1} neutron_tempest_plugin.api.admin.test_logging.LoggingTestJSON.test_log_lifecycle [1.823059s] ... ok'
- '{2} neutron_tempest_plugin.api.admin.test_extension_driver_port_security_admin.PortSecurityAdminTests.test_create_port_security_false_on_shared_network [4.686974s] ... ok'
- '{3} setUpClass (neutron_tempest_plugin.api.admin.test_agent_availability_zone.AgentAvailabilityZoneTestCase) ... SKIPPED: availability_zone supported agent not found.'
- '{3} neutron_tempest_plugin.api.admin.test_agent_management.AgentManagementTestJSON.test_delete_agent_negative [0.153653s] ... ok'
- '{3} neutron_tempest_plugin.api.admin.test_agent_management.AgentManagementTestJSON.test_list_agent [0.264714s] ... ok'
- '{3} neutron_tempest_plugin.api.admin.test_agent_management.AgentManagementTestJSON.test_list_agents_non_admin [0.577192s] ... ok'
- '{3} neutron_tempest_plugin.api.admin.test_agent_management.AgentManagementTestJSON.test_show_agent [0.200018s] ... ok'
- '{1} neutron_tempest_plugin.api.admin.test_logging_negative.LoggingNegativeTestJSON.test_create_log_with_invalid_resource_type [0.477196s] ... ok'
- '{1} neutron_tempest_plugin.api.admin.test_logging_negative.LoggingNegativeTestJSON.test_create_log_with_nonexistent_port [0.204151s] ... ok'
- '{1} neutron_tempest_plugin.api.admin.test_logging_negative.LoggingNegativeTestJSON.test_create_log_with_nonexistent_sg [0.071226s] ... ok'
- '{3} neutron_tempest_plugin.api.admin.test_agent_management.AgentManagementTestJSON.test_update_agent_description [0.413646s] ... ok'
- '{3} neutron_tempest_plugin.api.admin.test_agent_management.AgentManagementTestJSON.test_update_agent_status [0.198925s] ... ok'
- '{1} setUpClass (neutron_tempest_plugin.api.admin.test_ports.PortTestCasesResourceRequest) ... SKIPPED: Skipped as provider VLANs are not available in config'
- '{3} setUpClass (neutron_tempest_plugin.api.admin.test_default_security_group_rules.DefaultSecurityGroupRuleTest) ... SKIPPED: security-groups-default-rules extension not enabled.'
- '{3} setUpClass (neutron_tempest_plugin.api.admin.test_l3_agent_scheduler.L3AgentSchedulerTestJSON) ... SKIPPED: l3_agent_scheduler extension not enabled.'
- '{1} neutron_tempest_plugin.api.admin.test_shared_network_extension.AllowedAddressPairSharedNetworkTest.test_create_with_address_pair_blocked_on_other_network [0.689643s] ... ok'
- '{3} neutron_tempest_plugin.api.admin.test_ports.PortTestCasesAdmin.test_regenerate_mac_address [1.583062s] ... ok'
- '{1} neutron_tempest_plugin.api.admin.test_shared_network_extension.AllowedAddressPairSharedNetworkTest.test_update_with_address_pair_blocked_on_other_network [1.413274s] ... ok'
- '{2} neutron_tempest_plugin.api.admin.test_external_network_extension.ExternalNetworksRBACTestJSON.test_delete_policies_while_tenant_attached_to_net [10.148618s] ... ok'
- '{3} neutron_tempest_plugin.api.admin.test_ports.PortTestCasesAdmin.test_update_mac_address [1.614354s] ... ok'
- '{2} neutron_tempest_plugin.api.admin.test_external_network_extension.ExternalNetworksRBACTestJSON.test_external_conversion_on_one_policy_delete [4.053436s] ... ok'
- '{2} neutron_tempest_plugin.api.admin.test_external_network_extension.ExternalNetworksRBACTestJSON.test_external_conversion_on_policy_create [2.500601s] ... ok'
- '{3} setUpClass (neutron_tempest_plugin.api.admin.test_routers_flavors.RoutersFlavorTestCase) ... SKIPPED: l3-flavors extension not enabled.'
- '{2} neutron_tempest_plugin.api.admin.test_external_network_extension.ExternalNetworksRBACTestJSON.test_external_conversion_on_policy_delete [2.225582s] ... ok'
- '{2} neutron_tempest_plugin.api.admin.test_external_network_extension.ExternalNetworksRBACTestJSON.test_external_network_on_shared_policy_delete [2.347570s] ... ok'
- '{1} neutron_tempest_plugin.api.admin.test_tag.TagFilterQosPolicyTestJSON.test_filter_qos_policy_tags [2.552652s] ... ok'
- '{2} neutron_tempest_plugin.api.admin.test_external_network_extension.ExternalNetworksRBACTestJSON.test_external_update_policy_from_wildcard_to_specific_tenant [5.921747s] ... ok'
- '{2} neutron_tempest_plugin.api.admin.test_external_network_extension.ExternalNetworksRBACTestJSON.test_policy_allows_tenant_to_allocate_floatingip [4.590408s] ... ok'
- '{3} neutron_tempest_plugin.api.admin.test_tag.TagFilterFloatingIpTestJSON.test_filter_floatingip_tags [1.239771s] ... ok'
- '{2} neutron_tempest_plugin.api.admin.test_external_network_extension.ExternalNetworksRBACTestJSON.test_policy_allows_tenant_to_attach_ext_gw [6.120847s] ... ok'
- '{1} neutron_tempest_plugin.api.admin.test_tag.TagFilterSecGroupTestJSON.test_filter_security_group_tags [1.103136s] ... ok'
- '{2} neutron_tempest_plugin.api.admin.test_external_network_extension.ExternalNetworksRBACTestJSON.test_regular_client_blocked_from_creating_external_wild_policies [1.276411s] ... ok'
- '{2} neutron_tempest_plugin.api.admin.test_external_network_extension.ExternalNetworksRBACTestJSON.test_regular_client_shares_with_another [2.852103s] ... ok'
- '{2} neutron_tempest_plugin.api.admin.test_external_network_extension.ExternalNetworksRBACTestJSON.test_wildcard_policy_created_from_external_network_api [4.921274s] ... ok'
- '{2} neutron_tempest_plugin.api.admin.test_external_network_extension.ExternalNetworksRBACTestJSON.test_wildcard_policy_delete_blocked_on_default_ext [1.313892s] ... ok'
- '{1} neutron_tempest_plugin.api.admin.test_tag.TagFilterSubnetpoolTestJSON.test_filter_subnetpool_tags [1.073496s] ... ok'
- '{3} neutron_tempest_plugin.api.admin.test_tag.TagFilterNetworkTestJSON.test_filter_network_tags [1.689970s] ... ok'
- '{0} tempest.scenario.test_server_basic_ops.TestServerBasicOps.test_server_basic_ops [51.919366s] ... ok'
- '{0} setUpClass (neutron_tempest_plugin.api.admin.test_network_segment_range.NetworkSegmentRangeTestJson) ... SKIPPED: network-segment-range extension not enabled.'
- '{1} neutron_tempest_plugin.api.admin.test_tag.TagPortTestJSON.test_port_tags [2.683534s] ... ok'
- '{0} neutron_tempest_plugin.api.admin.test_networks.NetworksTestAdmin.test_create_network_with_project [1.897819s] ... ok'
- '{0} neutron_tempest_plugin.api.admin.test_networks.NetworksTestAdmin.test_create_network_with_project_and_other_tenant [0.124799s] ... ok'
- '{0} neutron_tempest_plugin.api.admin.test_networks.NetworksTestAdmin.test_create_network_with_project_and_tenant [0.870146s] ... ok'
- '{0} neutron_tempest_plugin.api.admin.test_networks.NetworksTestAdmin.test_create_tenant_network_vxlan ... SKIPPED: VXLAN type_driver is not enabled'
- '{2} neutron_tempest_plugin.api.admin.test_floating_ips_admin_actions.FloatingIPAdminTestJSON.test_associate_floating_ip_with_port_from_another_project [4.551834s] ... ok'
- '{2} neutron_tempest_plugin.api.admin.test_floating_ips_admin_actions.FloatingIPAdminTestJSON.test_create_floatingip_with_specified_ip_address [4.462521s] ... ok'
- '{1} neutron_tempest_plugin.api.admin.test_tag.TagRouterTestJSON.test_router_tags [5.735888s] ... ok'
- '{3} neutron_tempest_plugin.api.admin.test_tag.TagFilterTrunkTestJSON.test_filter_trunk_tags [1.490591s] ... ok'
- '{0} neutron_tempest_plugin.api.admin.test_quotas_negative.QuotasAdminNegativeTestJSON.test_create_floatingip_when_quotas_is_full [2.503035s] ... ok'
- '{0} neutron_tempest_plugin.api.admin.test_quotas_negative.QuotasAdminNegativeTestJSON.test_create_network_when_quotas_is_full [2.959785s] ... ok'
- '{1} neutron_tempest_plugin.api.admin.test_tag.UpdateTagsTest.test_update_tags_affects_only_updated_resource [4.300004s] ... ok'
- '{0} neutron_tempest_plugin.api.admin.test_quotas_negative.QuotasAdminNegativeTestJSON.test_create_port_when_quotas_is_full [6.381304s] ... ok'
- '{0} neutron_tempest_plugin.api.admin.test_quotas_negative.QuotasAdminNegativeTestJSON.test_create_router_when_quotas_is_full [2.302207s] ... ok'
- '{0} neutron_tempest_plugin.api.admin.test_quotas_negative.QuotasAdminNegativeTestJSON.test_create_security_group_rule_when_quotas_is_full [2.358021s] ... ok'
- '{2} neutron_tempest_plugin.api.admin.test_quotas.QuotasTest.test_detail_quotas [4.271578s] ... ok'
- '{2} neutron_tempest_plugin.api.admin.test_quotas.QuotasTest.test_quotas [1.299900s] ... ok'
- '{0} neutron_tempest_plugin.api.admin.test_quotas_negative.QuotasAdminNegativeTestJSON.test_create_security_group_when_quotas_is_full [1.792787s] ... ok'
- '{3} neutron_tempest_plugin.api.admin.test_tag.TagQosPolicyTestJSON.test_qos_policy_tags [2.972602s] ... ok'
- '{0} neutron_tempest_plugin.api.admin.test_quotas_negative.QuotasAdminNegativeTestJSON.test_create_subnet_when_quotas_is_full [4.551952s] ... ok'
- '{2} neutron_tempest_plugin.api.admin.test_security_groups.SecGroupAdminTest.test_security_group_recreated_on_port_update [7.610181s] ... ok'
- '{3} neutron_tempest_plugin.api.admin.test_tag.TagTrunkTestJSON.test_trunk_tags [4.409673s] ... ok'
- '{0} setUpClass (neutron_tempest_plugin.api.admin.test_routers_ha.RoutersTestHA) ... SKIPPED: l3-ha extension not enabled.'
- '{1} neutron_tempest_plugin.api.test_auto_allocated_topology.TestAutoAllocatedTopology.test_delete_allocated_net_topology_as_tenant [23.676270s] ... ok'
- '{3} neutron_tempest_plugin.api.test_address_scopes_negative.AddressScopeTestNegative.test_delete_address_scope_associated_with_subnetpool [2.185197s] ... ok'
- '{3} neutron_tempest_plugin.api.test_address_scopes_negative.AddressScopeTestNegative.test_delete_non_existent_address_scope [0.102985s] ... ok'
- '{3} neutron_tempest_plugin.api.test_address_scopes_negative.AddressScopeTestNegative.test_get_non_existent_address_scope [0.102643s] ... ok'
- '{3} neutron_tempest_plugin.api.test_address_scopes_negative.AddressScopeTestNegative.test_tenant_create_shared_address_scope [0.078539s] ... ok'
- '{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.RBACSharedNetworksTest.test_create_rbac_policy_with_target_tenant_none [5.283354s] ... ok'
- '{3} neutron_tempest_plugin.api.test_address_scopes_negative.AddressScopeTestNegative.test_tenant_get_not_shared_admin_address_scope [0.542851s] ... ok'
- '{3} neutron_tempest_plugin.api.test_address_scopes_negative.AddressScopeTestNegative.test_tenant_update_address_scope_shared_false [0.238029s] ... ok'
- '{3} neutron_tempest_plugin.api.test_address_scopes_negative.AddressScopeTestNegative.test_tenant_update_address_scope_shared_true [0.268173s] ... ok'
- '{3} neutron_tempest_plugin.api.test_address_scopes_negative.AddressScopeTestNegative.test_update_non_existent_address_scope [0.108735s] ... ok'
- '{2} neutron_tempest_plugin.api.admin.test_tag.TagSecGroupTestJSON.test_security_group_tags [2.561537s] ... ok'
- '{3} neutron_tempest_plugin.api.test_address_scopes_negative.AddressScopeTestNegative.test_update_shared_address_scope_to_unshare [0.239790s] ... ok'
- '{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.RBACSharedNetworksTest.test_create_rbac_policy_with_target_tenant_too_long_id [4.151625s] ... ok'
- '{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.RBACSharedNetworksTest.test_delete_self_share_rule [2.529939s] ... ok'
- '{1} neutron_tempest_plugin.api.test_auto_allocated_topology.TestAutoAllocatedTopology.test_get_allocated_net_topology_as_tenant [13.724974s] ... ok'
- '{3} neutron_tempest_plugin.api.test_flavors_extensions.TestFlavorsIpV6TestJSON.test_create_update_delete_flavor [0.594934s] ... ok'
- '{2} neutron_tempest_plugin.api.admin.test_tag.TagSubnetPoolTestJSON.test_subnetpool_tags [3.147804s] ... ok'
- '{3} neutron_tempest_plugin.api.test_flavors_extensions.TestFlavorsIpV6TestJSON.test_create_update_delete_service_profile [0.810369s] ... ok'
- '{3} neutron_tempest_plugin.api.test_flavors_extensions.TestFlavorsIpV6TestJSON.test_list_flavors [0.328330s] ... ok'
- '{3} neutron_tempest_plugin.api.test_flavors_extensions.TestFlavorsIpV6TestJSON.test_list_service_profiles [0.060740s] ... ok'
- '{3} neutron_tempest_plugin.api.test_flavors_extensions.TestFlavorsIpV6TestJSON.test_show_flavor [0.168775s] ... ok'
- '{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.RBACSharedNetworksTest.test_duplicate_policy_error [5.171543s] ... ok'
- '{3} neutron_tempest_plugin.api.test_flavors_extensions.TestFlavorsIpV6TestJSON.test_show_service_profile [0.104093s] ...
ok' - '{2} setUpClass (neutron_tempest_plugin.api.test_address_groups.RbacSharedAddressGroupTest) ... SKIPPED: rbac-address-group extension not enabled.' - '{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.RBACSharedNetworksTest.test_filter_fields [2.912524s] ... ok' - '{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.RBACSharedNetworksTest.test_filter_policies [1.940854s] ... ok' - '{1} setUpClass (neutron_tempest_plugin.api.test_conntrack_helper.ConntrackHelperTestJSON) ... SKIPPED: l3-conntrack-helper extension not enabled.' - '{2} neutron_tempest_plugin.api.test_address_scopes.AddressScopeTest.test_admin_create_shared_address_scope [1.811667s] ... ok' - '{2} neutron_tempest_plugin.api.test_address_scopes.AddressScopeTest.test_admin_update_shared_address_scope [0.528687s] ... ok' - '{2} neutron_tempest_plugin.api.test_address_scopes.AddressScopeTest.test_delete_address_scope [0.710185s] ... ok' - '{2} neutron_tempest_plugin.api.test_address_scopes.AddressScopeTest.test_show_address_scope [0.458182s] ... ok' - '{2} neutron_tempest_plugin.api.test_address_scopes.AddressScopeTest.test_show_address_scope_project_id [0.210792s] ... ok' - '{2} neutron_tempest_plugin.api.test_address_scopes.AddressScopeTest.test_tenant_create_list_address_scope [0.424569s] ... ok' - '{2} neutron_tempest_plugin.api.test_address_scopes.AddressScopeTest.test_tenant_update_address_scope [0.593759s] ... ok' - '{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.RBACSharedNetworksTest.test_filtering_works_with_rbac_records_present [5.826728s] ... ok' - '{2} setUpClass (neutron_tempest_plugin.api.test_metering_extensions.MeteringIpV6TestJSON) ... SKIPPED: metering extension not enabled.' - '{2} setUpClass (neutron_tempest_plugin.api.test_metering_negative.MeteringNegativeTestJSON) ... SKIPPED: metering extension not enabled.' 
- '{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.RBACSharedNetworksTest.test_network_only_visible_to_policy_target [3.931753s] ... ok' - '{1} neutron_tempest_plugin.api.test_floating_ips.FloatingIPPoolTestJSON.test_create_floatingip_from_specific_pool [7.174111s] ... ok' - '{3} neutron_tempest_plugin.api.test_floating_ips.FloatingIPTestJSON.test_blank_update_clears_association [2.820457s] ... ok' - '{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.RBACSharedNetworksTest.test_policy_show [4.711084s] ... ok' - '{3} neutron_tempest_plugin.api.test_floating_ips.FloatingIPTestJSON.test_create_update_floatingip_description [4.832600s] ... ok' - '{2} neutron_tempest_plugin.api.test_network_ip_availability.NetworksIpAvailabilityIPv4Test.test_list_ip_availability_after_port_delete [5.632098s] ... ok' - '{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.RBACSharedNetworksTest.test_policy_target_update [4.468503s] ... ok' - '{2} neutron_tempest_plugin.api.test_network_ip_availability.NetworksIpAvailabilityIPv4Test.test_list_ip_availability_after_subnet_and_ports [3.751487s] ... ok' - '{3} neutron_tempest_plugin.api.test_floating_ips.FloatingIPTestJSON.test_create_update_floatingip_port_details [4.172069s] ... ok' - '{2} neutron_tempest_plugin.api.test_network_ip_availability.NetworksIpAvailabilityIPv4Test.test_list_ip_availability_before_subnet [0.889783s] ... ok' - '{1} neutron_tempest_plugin.api.test_network_ip_availability.NetworksIpAvailabilityIPv6Test.test_list_ip_availability_after_port_delete [4.330884s] ... ok' - '{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.RBACSharedNetworksTest.test_port_presence_prevents_network_rbac_policy_deletion [6.608564s] ... ok' - '{3} neutron_tempest_plugin.api.test_floating_ips.FloatingIPTestJSON.test_floatingip_update_extra_attributes_port_id_not_changed [3.460644s] ... 
ok' - '{2} neutron_tempest_plugin.api.test_network_ip_availability.NetworksIpAvailabilityIPv4Test.test_show_ip_availability_after_port_delete [3.169123s] ... ok' - '{1} neutron_tempest_plugin.api.test_network_ip_availability.NetworksIpAvailabilityIPv6Test.test_list_ip_availability_after_subnet_and_ports [2.015763s] ... ok' - '{1} neutron_tempest_plugin.api.test_network_ip_availability.NetworksIpAvailabilityIPv6Test.test_list_ip_availability_before_subnet [0.948350s] ... ok' - '{2} neutron_tempest_plugin.api.test_network_ip_availability.NetworksIpAvailabilityIPv4Test.test_show_ip_availability_after_subnet_and_ports_create [2.543879s] ... ok' - '{1} neutron_tempest_plugin.api.test_network_ip_availability.NetworksIpAvailabilityIPv6Test.test_list_ipv6_ip_availability_after_subnet_and_ports [2.114246s] ... ok' - '{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.RBACSharedNetworksTest.test_rbac_bumps_network_revision [5.175137s] ... ok' - '{1} neutron_tempest_plugin.api.test_network_ip_availability.NetworksIpAvailabilityIPv6Test.test_show_ip_availability_after_port_delete [2.872928s] ... ok' - '{1} neutron_tempest_plugin.api.test_network_ip_availability.NetworksIpAvailabilityIPv6Test.test_show_ip_availability_after_subnet_and_ports_create [2.154625s] ... ok' - '{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.RBACSharedNetworksTest.test_rbac_policy_quota [5.613359s] ... ok' - '{3} setUpClass (neutron_tempest_plugin.api.test_local_ip.LocalIPAssociationTestJSON) ... SKIPPED: local_ip extension not enabled.' - '{3} setUpClass (neutron_tempest_plugin.api.test_local_ip.LocalIPTestJSON) ... SKIPPED: local_ip extension not enabled.' - '{3} setUpClass (neutron_tempest_plugin.api.test_ndp_proxy.NDPProxyTestJSON) ... SKIPPED: l3-ndp-proxy extension not enabled.' - '{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.RBACSharedNetworksTest.test_regular_client_blocked_from_sharing_anothers_network [4.413670s] ... 
ok' - '{2} neutron_tempest_plugin.api.test_ports_negative.PortsNegativeTestJSON.test_add_port_with_illegal_ip [0.105194s] ... ok' - '{2} neutron_tempest_plugin.api.test_ports_negative.PortsNegativeTestJSON.test_add_port_with_nonexist_network_id [0.114777s] ... ok' - '{2} neutron_tempest_plugin.api.test_ports_negative.PortsNegativeTestJSON.test_add_port_with_nonexist_security_groups_id [0.153464s] ... ok' - '{2} neutron_tempest_plugin.api.test_ports_negative.PortsNegativeTestJSON.test_add_port_with_nonexist_tenant_id [0.086532s] ... ok' - '{2} neutron_tempest_plugin.api.test_ports_negative.PortsNegativeTestJSON.test_add_port_with_too_long_description [0.093438s] ... ok' - '{2} neutron_tempest_plugin.api.test_ports_negative.PortsNegativeTestJSON.test_add_port_with_too_long_name [0.076172s] ... ok' - '{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.RBACSharedNetworksTest.test_regular_client_blocked_from_sharing_with_wildcard [2.259728s] ... ok' - '{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.RBACSharedNetworksTest.test_regular_client_shares_to_another_regular_client [1.647954s] ... ok' - '{3} neutron_tempest_plugin.api.test_networks.NetworksSearchCriteriaTest.test_list_no_pagination_limit_0 [0.172839s] ... ok' - '{3} neutron_tempest_plugin.api.test_networks.NetworksSearchCriteriaTest.test_list_pagination [0.898385s] ... ok' - '{3} neutron_tempest_plugin.api.test_networks.NetworksSearchCriteriaTest.test_list_pagination_page_reverse_asc [0.399728s] ... ok' - '{3} neutron_tempest_plugin.api.test_networks.NetworksSearchCriteriaTest.test_list_pagination_page_reverse_desc [0.514045s] ... ok' - '{3} neutron_tempest_plugin.api.test_networks.NetworksSearchCriteriaTest.test_list_pagination_page_reverse_with_href_links [1.350247s] ... ok' - '{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.RBACSharedNetworksTest.test_subnet_on_network_only_visible_to_policy_target [3.792578s] ... 
ok' - '{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_create_policy [1.022506s] ... ok' - '{1} neutron_tempest_plugin.api.test_qos.QosBandwidthLimitRuleTestJSON.test_attach_and_detach_a_policy_by_a_tenant ... SKIPPED: Creation of shared resources should be allowed,' - ' setting the create_shared_resources option as ''True'' is needed' - '{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_default_policy_creating_network_with_policy [3.696015s] ... ok' - '{3} neutron_tempest_plugin.api.test_networks.NetworksSearchCriteriaTest.test_list_pagination_with_href_links [4.279758s] ... ok' - '{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.RBACSharedNetworksTest.test_tenant_can_delete_port_on_own_network [5.077489s] ... ok' - '{1} neutron_tempest_plugin.api.test_qos.QosBandwidthLimitRuleTestJSON.test_get_rules_by_policy [4.757349s] ... ok' - '{3} neutron_tempest_plugin.api.test_networks.NetworksSearchCriteriaTest.test_list_pagination_with_marker [1.953801s] ... ok' - '{3} neutron_tempest_plugin.api.test_networks.NetworksSearchCriteriaTest.test_list_sorts_asc [0.158171s] ... ok' - '{3} neutron_tempest_plugin.api.test_networks.NetworksSearchCriteriaTest.test_list_sorts_desc [0.852153s] ... ok' - '{1} neutron_tempest_plugin.api.test_qos.QosBandwidthLimitRuleTestJSON.test_rule_create [1.846129s] ... ok' - '{3} neutron_tempest_plugin.api.test_networks.NetworksSearchCriteriaTest.test_list_validation_filters [0.210388s] ... ok' - '{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.RBACSharedNetworksTest.test_tenant_cant_delete_other_tenants_ports [2.457686s] ... ok' - '{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_default_policy_creating_network_without_policy [3.838242s] ... ok' - '{1} neutron_tempest_plugin.api.test_qos.QosBandwidthLimitRuleTestJSON.test_rule_create_fail_for_the_same_type [0.925686s] ... 
ok' - '{1} neutron_tempest_plugin.api.test_qos.QosBandwidthLimitRuleTestJSON.test_rule_create_forbidden_for_regular_tenants [0.157129s] ... ok' - '{1} neutron_tempest_plugin.api.test_qos.QosBandwidthLimitRuleTestJSON.test_rule_create_rule_nonexistent_policy [0.130695s] ... ok' - '{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_delete_not_allowed_if_policy_in_use_by_network [0.977860s] ... ok' - '{1} neutron_tempest_plugin.api.test_qos.QosBandwidthLimitRuleTestJSON.test_rule_delete [1.516228s] ... ok' - '{1} neutron_tempest_plugin.api.test_qos.QosBandwidthLimitRuleTestJSON.test_rule_update [1.169838s] ... ok' - '{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.RBACSharedNetworksTest.test_update_self_share_rule [4.114812s] ... ok' - '{1} neutron_tempest_plugin.api.test_qos.QosBandwidthLimitRuleTestJSON.test_rule_update_forbidden_for_regular_tenants_foreign_policy [0.783567s] ... ok' - '{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_delete_not_allowed_if_policy_in_use_by_port [3.269851s] ... ok' - '{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_delete_policy [0.487114s] ... ok' - '{1} neutron_tempest_plugin.api.test_qos.QosBandwidthLimitRuleTestJSON.test_rule_update_forbidden_for_regular_tenants_own_policy [0.760475s] ... ok' - '{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_get_policy_that_is_shared [0.294097s] ... ok' - '{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_list_admin_rule_types [0.080477s] ... ok' - '{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_list_policy_filter_by_name [1.044553s] ... ok' - '{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_list_policy_sort_by_name [1.125442s] ... ok' - '{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_list_regular_rule_types [0.113008s] ... ok' - '{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_policy_association_with_admin_network [2.489080s] ... 
ok' - '{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_policy_association_with_network_non_shared_policy [0.413875s] ... ok' - '{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_policy_association_with_network_nonexistent_policy [0.316380s] ... ok' - '{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_policy_association_with_port_non_shared_policy [1.604576s] ... ok' - '{3} neutron_tempest_plugin.api.test_networks_negative.NetworksNegativeTest.test_delete_network_in_use [1.526792s] ... ok' - '{3} neutron_tempest_plugin.api.test_networks_negative.NetworksNegativeTest.test_update_network_mtu [0.359368s] ... ok' - '{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_policy_association_with_port_nonexistent_policy [0.888316s] ... ok' - '{1} neutron_tempest_plugin.api.test_qos.QosBandwidthLimitRuleWithDirectionTestJSON.test_attach_and_detach_a_policy_by_a_tenant ... SKIPPED: Creation of shared resources should be allowed,' - ' setting the create_shared_resources option as ''True'' is needed' - '{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_policy_association_with_port_shared_policy [2.016138s] ... ok' - '{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_policy_association_with_tenant_network [2.065159s] ... ok' - '{1} neutron_tempest_plugin.api.test_qos.QosBandwidthLimitRuleWithDirectionTestJSON.test_create_policy_with_multiple_rules [2.728317s] ... ok' - '{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_policy_create_forbidden_for_regular_tenants [0.118396s] ... ok' - '{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_policy_update [0.376464s] ... ok' - '{1} neutron_tempest_plugin.api.test_qos.QosBandwidthLimitRuleWithDirectionTestJSON.test_get_rules_by_policy [1.290158s] ... ok' - '{1} neutron_tempest_plugin.api.test_qos.QosBandwidthLimitRuleWithDirectionTestJSON.test_rule_create [1.003631s] ... 
ok' - '{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_policy_update_association_with_admin_network [1.988622s] ... ok' - '{1} neutron_tempest_plugin.api.test_qos.QosBandwidthLimitRuleWithDirectionTestJSON.test_rule_create_fail_for_the_same_type [0.622834s] ... ok' - '{1} neutron_tempest_plugin.api.test_qos.QosBandwidthLimitRuleWithDirectionTestJSON.test_rule_create_forbidden_for_regular_tenants [0.124058s] ... ok' - '{1} neutron_tempest_plugin.api.test_qos.QosBandwidthLimitRuleWithDirectionTestJSON.test_rule_create_rule_nonexistent_policy [0.138091s] ... ok' - '{1} neutron_tempest_plugin.api.test_qos.QosBandwidthLimitRuleWithDirectionTestJSON.test_rule_delete [1.119572s] ... ok' - '{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_policy_update_association_with_port_shared_policy [2.277442s] ... ok' - '{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_policy_update_forbidden_for_regular_tenants_foreign_policy [0.235575s] ... ok' - '{1} neutron_tempest_plugin.api.test_qos.QosBandwidthLimitRuleWithDirectionTestJSON.test_rule_update [0.917897s] ... ok' - '{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_policy_update_forbidden_for_regular_tenants_own_policy [0.472488s] ... ok' - '{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.SharedNetworksTest.test_create_bulk_shared_network [2.547282s] ... ok' - '{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_qos_policy_delete_with_rules [0.570282s] ... ok' - '{1} neutron_tempest_plugin.api.test_qos.QosBandwidthLimitRuleWithDirectionTestJSON.test_rule_update_1 [0.812254s] ... ok' - '{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_shared_policy_update [0.714555s] ... ok' - '{1} neutron_tempest_plugin.api.test_qos.QosBandwidthLimitRuleWithDirectionTestJSON.test_rule_update_2 [1.046047s] ... ok' - '{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_show_policy_has_project_id [0.526008s] ... 
ok' - '{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_show_rule_type_details_as_admin [0.097554s] ... ok' - '{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_show_rule_type_details_as_user [0.100763s] ... ok' - '{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.SharedNetworksTest.test_create_port_shared_network_as_non_admin_tenant [2.068579s] ... ok' - '{1} neutron_tempest_plugin.api.test_qos.QosBandwidthLimitRuleWithDirectionTestJSON.test_rule_update_forbidden_for_regular_tenants_foreign_policy [0.670657s] ... ok' - '{1} neutron_tempest_plugin.api.test_qos.QosBandwidthLimitRuleWithDirectionTestJSON.test_rule_update_forbidden_for_regular_tenants_own_policy [0.549154s] ... ok' - '{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.SharedNetworksTest.test_create_update_shared_network [1.411960s] ... ok' - '{2} neutron_tempest_plugin.api.test_qos.QosTestJSON.test_user_create_port_with_admin_qos_policy [1.912578s] ... ok' - '{3} neutron_tempest_plugin.api.test_port_forwarding_negative.PortForwardingNegativeTestJSON.test_mapping_different_external_ports_to_the_same_destination [2.782070s] ... ok' - '{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.SharedNetworksTest.test_filtering_shared_networks [1.540699s] ... ok' - '{3} neutron_tempest_plugin.api.test_port_forwarding_negative.PortForwardingNegativeTestJSON.test_mapping_same_fip_and_external_port_to_different_dest [2.736586s] ... ok' - '{3} neutron_tempest_plugin.api.test_port_forwarding_negative.PortForwardingNegativeTestJSON.test_out_of_range_ports [2.756117s] ... ok' - '{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.SharedNetworksTest.test_filtering_shared_subnets [5.189565s] ... ok' - '{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.SharedNetworksTest.test_list_shared_networks [0.956732s] ... 
ok' - '{0} neutron_tempest_plugin.api.admin.test_shared_network_extension.SharedNetworksTest.test_show_shared_networks_attribute [0.333569s] ... ok' - '{1} neutron_tempest_plugin.api.test_qos.QosSearchCriteriaTest.test_list_no_pagination_limit_0 [0.662606s] ... ok' - '{1} neutron_tempest_plugin.api.test_qos.QosSearchCriteriaTest.test_list_pagination [1.112544s] ... ok' - '{1} neutron_tempest_plugin.api.test_qos.QosSearchCriteriaTest.test_list_pagination_page_reverse_asc [0.282426s] ... ok' - '{1} neutron_tempest_plugin.api.test_qos.QosSearchCriteriaTest.test_list_pagination_page_reverse_desc [0.330405s] ... ok' - '{1} neutron_tempest_plugin.api.test_qos.QosSearchCriteriaTest.test_list_pagination_page_reverse_with_href_links [1.451963s] ... ok' - '{1} neutron_tempest_plugin.api.test_qos.QosSearchCriteriaTest.test_list_pagination_with_href_links [2.307660s] ... ok' - '{1} neutron_tempest_plugin.api.test_qos.QosSearchCriteriaTest.test_list_pagination_with_marker [1.356914s] ... ok' - '{1} neutron_tempest_plugin.api.test_qos.QosSearchCriteriaTest.test_list_sorts_asc [0.520452s] ... ok' - '{1} neutron_tempest_plugin.api.test_qos.QosSearchCriteriaTest.test_list_sorts_desc [1.604479s] ... ok' - '{2} neutron_tempest_plugin.api.test_qos.RbacSharedQosPoliciesTest.test_filter_fields [1.926834s] ... ok' - '{1} neutron_tempest_plugin.api.test_qos_negative.QosMinimumPpsRuleNegativeTestJSON.test_rule_create_rule_non_existent_policy [0.461581s] ... ok' - '{1} neutron_tempest_plugin.api.test_qos_negative.QosMinimumPpsRuleNegativeTestJSON.test_rule_update_rule_nonexistent_policy [0.858768s] ... ok' - '{1} neutron_tempest_plugin.api.test_qos_negative.QosMinimumPpsRuleNegativeTestJSON.test_rule_update_rule_nonexistent_rule [0.245830s] ... ok' - '{2} neutron_tempest_plugin.api.test_qos.RbacSharedQosPoliciesTest.test_filter_rbac_policies [2.490463s] ... ok' - '{0} neutron_tempest_plugin.api.admin.test_tag.TagFilterPortTestJSON.test_filter_port_tags [5.811020s] ... 
ok' - '{2} neutron_tempest_plugin.api.test_qos.RbacSharedQosPoliciesTest.test_net_bound_shared_policy_wildcard_and_project_id_wild_remove [7.546834s] ... ok' - '{2} neutron_tempest_plugin.api.test_qos.RbacSharedQosPoliciesTest.test_net_bound_shared_policy_wildcard_and_projectid_wild_remains [4.456532s] ... ok' - '{1} neutron_tempest_plugin.api.test_qos_negative.QosNegativeTestJSON.test_add_policy_with_too_long_description [0.448426s] ... ok' - '{1} neutron_tempest_plugin.api.test_qos_negative.QosNegativeTestJSON.test_add_policy_with_too_long_name [0.080204s] ... ok' - '{1} neutron_tempest_plugin.api.test_qos_negative.QosNegativeTestJSON.test_add_policy_with_too_long_tenant_id [0.106243s] ... ok' - '{3} neutron_tempest_plugin.api.test_port_forwardings.PortForwardingTestJSON.test_associate_2_port_forwardings_to_floating_ip [7.367977s] ... ok' - '{1} neutron_tempest_plugin.api.test_qos_negative.QosNegativeTestJSON.test_delete_non_existent_qos_policy [0.967944s] ... ok' - '{1} neutron_tempest_plugin.api.test_qos_negative.QosNegativeTestJSON.test_get_non_existent_qos_policy [0.183500s] ... ok' - '{1} neutron_tempest_plugin.api.test_qos_negative.QosNegativeTestJSON.test_update_non_existent_qos_policy [0.112371s] ... ok' - '{1} neutron_tempest_plugin.api.test_qos_negative.QosNegativeTestJSON.test_update_policy_with_too_long_description [0.271208s] ... ok' - '{1} neutron_tempest_plugin.api.test_qos_negative.QosNegativeTestJSON.test_update_policy_with_too_long_name [0.209588s] ... ok' - '{2} neutron_tempest_plugin.api.test_qos.RbacSharedQosPoliciesTest.test_network_presence_prevents_policy_rbac_policy_deletion [4.603639s] ... ok' - '{1} setUpClass (neutron_tempest_plugin.api.test_routers.DvrRoutersTestToCentralized) ... SKIPPED: dvr extension not enabled.' - '{1} setUpClass (neutron_tempest_plugin.api.test_routers.HaRoutersTest) ... SKIPPED: l3-ha extension not enabled.' 
- '{3} neutron_tempest_plugin.api.test_port_forwardings.PortForwardingTestJSON.test_associate_port_forwarding_to_2_fixed_ips [4.991159s] ... ok' - '{2} neutron_tempest_plugin.api.test_qos.RbacSharedQosPoliciesTest.test_policy_sharing_with_wildcard [3.400072s] ... ok' - '{3} neutron_tempest_plugin.api.test_port_forwardings.PortForwardingTestJSON.test_associate_port_forwarding_to_port_with_fip [2.914703s] ... ok' - '{2} neutron_tempest_plugin.api.test_qos.RbacSharedQosPoliciesTest.test_policy_sharing_with_wildcard_and_project_id [2.470462s] ... ok' - '{2} neutron_tempest_plugin.api.test_qos.RbacSharedQosPoliciesTest.test_policy_target_update [0.851857s] ... ok' - '{2} neutron_tempest_plugin.api.test_qos.RbacSharedQosPoliciesTest.test_rbac_policy_show [1.197836s] ... ok' - '{3} neutron_tempest_plugin.api.test_port_forwardings.PortForwardingTestJSON.test_associate_port_forwarding_to_used_floating_ip [3.814490s] ... ok' - '{0} neutron_tempest_plugin.api.admin.test_tag.TagFilterRouterTestJSON.test_filter_router_tags [1.939613s] ... ok' - '{2} neutron_tempest_plugin.api.test_qos.RbacSharedQosPoliciesTest.test_regular_client_blocked_from_sharing_anothers_policy [1.080434s] ... ok' - '{1} neutron_tempest_plugin.api.test_routers.RoutersIpV6Test.test_create_router_with_default_snat_value [5.072216s] ... ok' - '{2} neutron_tempest_plugin.api.test_qos.RbacSharedQosPoliciesTest.test_regular_client_shares_to_another_regular_client [1.126698s] ... ok' - '{3} neutron_tempest_plugin.api.test_port_forwardings.PortForwardingTestJSON.test_port_forwarding_info_in_fip_details [7.066597s] ... ok' - '{3} neutron_tempest_plugin.api.test_port_forwardings.PortForwardingTestJSON.test_port_forwarding_life_cycle [5.191294s] ... ok' - '{2} neutron_tempest_plugin.api.test_qos_negative.QosBandwidthLimitRuleNegativeTestJSON.test_rule_create_rule_non_existent_policy [0.525829s] ... 
ok' - '{1} neutron_tempest_plugin.api.test_routers.RoutersIpV6Test.test_create_router_with_snat_explicit [11.750769s] ... ok' - '{2} neutron_tempest_plugin.api.test_qos_negative.QosBandwidthLimitRuleNegativeTestJSON.test_rule_update_rule_nonexistent_policy [0.461035s] ... ok' - '{2} neutron_tempest_plugin.api.test_qos_negative.QosBandwidthLimitRuleNegativeTestJSON.test_rule_update_rule_nonexistent_rule [0.467391s] ... ok' - '{1} neutron_tempest_plugin.api.test_routers.RoutersIpV6Test.test_create_update_router_description [1.761838s] ... ok' - '{1} neutron_tempest_plugin.api.test_routers.RoutersIpV6Test.test_extra_routes_atomic ... SKIPPED: Skipped because network extension: extraroute-atomic is not enabled' - '{2} neutron_tempest_plugin.api.test_revisions.TestRevisions.test_portbinding_bumps_revision [6.601580s] ... ok' - '{2} neutron_tempest_plugin.api.test_revisions.TestRevisions.test_update_allowed_address_pairs_bumps_revision [3.984981s] ... ok' - '{1} neutron_tempest_plugin.api.test_routers.RoutersIpV6Test.test_network_attached_with_two_routers [18.022642s] ... ok' - '{2} neutron_tempest_plugin.api.test_revisions.TestRevisions.test_update_dns_domain_bumps_revision [4.356534s] ... ok' - '{0} neutron_tempest_plugin.api.admin.test_tag.TagFilterSubnetTestJSON.test_filter_subnet_tags [1.601336s] ... ok' - '{1} neutron_tempest_plugin.api.test_routers.RoutersIpV6Test.test_router_interface_status [5.908452s] ... ok' - '{2} neutron_tempest_plugin.api.test_revisions.TestRevisions.test_update_external_network_bumps_revision [2.003038s] ... ok' - '{3} neutron_tempest_plugin.api.test_ports.PortsSearchCriteriaTest.test_list_no_pagination_limit_0 [0.169012s] ... ok' - '{3} neutron_tempest_plugin.api.test_ports.PortsSearchCriteriaTest.test_list_pagination [1.060484s] ... ok' - '{2} neutron_tempest_plugin.api.test_revisions.TestRevisions.test_update_extra_dhcp_opt_bumps_revision [4.675054s] ... 
ok' - '{3} neutron_tempest_plugin.api.test_ports.PortsSearchCriteriaTest.test_list_pagination_page_reverse_asc [0.440410s] ... ok' - '{3} neutron_tempest_plugin.api.test_ports.PortsSearchCriteriaTest.test_list_pagination_page_reverse_desc [0.653336s] ... ok' - '{3} neutron_tempest_plugin.api.test_ports.PortsSearchCriteriaTest.test_list_pagination_page_reverse_with_href_links [0.628723s] ... ok' - '{3} neutron_tempest_plugin.api.test_ports.PortsSearchCriteriaTest.test_list_pagination_with_href_links [2.329151s] ... ok' - '{1} neutron_tempest_plugin.api.test_routers.RoutersIpV6Test.test_router_interface_update_and_remove_gateway_ip [9.087182s] ... ok' - '{3} neutron_tempest_plugin.api.test_ports.PortsSearchCriteriaTest.test_list_pagination_with_marker [1.725382s] ... ok' - '{3} neutron_tempest_plugin.api.test_ports.PortsSearchCriteriaTest.test_list_sorts_asc [0.223099s] ... ok' - '{3} neutron_tempest_plugin.api.test_ports.PortsSearchCriteriaTest.test_list_sorts_desc [0.202794s] ... ok' - '{0} neutron_tempest_plugin.api.admin.test_tag.TagFloatingIpTestJSON.test_floatingip_tags [2.750031s] ... ok' - '{2} neutron_tempest_plugin.api.test_revisions.TestRevisions.test_update_floatingip_bumps_revision [13.259553s] ... ok' - '{1} neutron_tempest_plugin.api.test_routers.RoutersIpV6Test.test_update_extra_route [9.014233s] ... ok' - '{2} neutron_tempest_plugin.api.test_revisions.TestRevisions.test_update_network_bumps_revision [2.280040s] ... ok' - '{2} neutron_tempest_plugin.api.test_revisions.TestRevisions.test_update_network_constrained_by_revision [2.051329s] ... ok' - '{3} neutron_tempest_plugin.api.test_ports.PortsTestJSON.test_add_ips_to_port [3.621467s] ... ok' - '{2} neutron_tempest_plugin.api.test_revisions.TestRevisions.test_update_port_bumps_revision [3.230490s] ... ok' - '{1} neutron_tempest_plugin.api.test_routers.RoutersIpV6Test.test_update_router_reset_gateway_without_snat [7.723832s] ... 
ok' - '{0} neutron_tempest_plugin.api.admin.test_tag.TagNetworkTestJSON.test_network_tags [2.935376s] ... ok' - '{3} neutron_tempest_plugin.api.test_ports.PortsTestJSON.test_change_dhcp_flag_then_create_port [4.251096s] ... ok' - '{3} neutron_tempest_plugin.api.test_ports.PortsTestJSON.test_create_update_port_description [2.085041s] ... ok' - '{2} neutron_tempest_plugin.api.test_revisions.TestRevisions.test_update_port_security_bumps_revisions [5.718283s] ... ok' - '{3} neutron_tempest_plugin.api.test_ports.PortsTestJSON.test_create_update_port_security [1.663191s] ... ok' - '{1} neutron_tempest_plugin.api.test_routers.RoutersIpV6Test.test_update_router_set_gateway_with_snat_explicit [6.671504s] ... ok' - '{2} neutron_tempest_plugin.api.test_revisions.TestRevisions.test_update_port_sg_binding_bumps_revision [6.121588s] ... ok' - '{3} neutron_tempest_plugin.api.test_ports.PortsTestJSON.test_create_update_port_with_dns_domain [6.447065s] ... ok' - '{2} neutron_tempest_plugin.api.test_revisions.TestRevisions.test_update_qos_network_policy_binding_bumps_revision [3.190510s] ... ok' - '{3} neutron_tempest_plugin.api.test_ports.PortsTestJSON.test_create_update_port_with_dns_name [2.815781s] ... ok' - '{1} neutron_tempest_plugin.api.test_routers.RoutersIpV6Test.test_update_router_set_gateway_without_snat [9.597167s] ... ok' - '{3} neutron_tempest_plugin.api.test_ports.PortsTestJSON.test_create_update_port_with_no_dns_name [2.326521s] ... ok' - '{0} neutron_tempest_plugin.api.admin.test_tag.TagSubnetTestJSON.test_subnet_tags [2.973892s] ... ok' - '{2} neutron_tempest_plugin.api.test_revisions.TestRevisions.test_update_qos_port_policy_binding_bumps_revision [3.429237s] ... ok' - '{3} neutron_tempest_plugin.api.test_ports.PortsTestJSON.test_port_shut_down ... SKIPPED: At least one DHCP agent is required to be running in the environment for this test.' - '{2} neutron_tempest_plugin.api.test_revisions.TestRevisions.test_update_router_bumps_revision [8.760727s] ... 
ok' - '{2} neutron_tempest_plugin.api.test_revisions.TestRevisions.test_update_router_extra_attributes_bumps_revision ... SKIPPED: Skipped because network extension: l3-ha is not enabled' - '{0} neutron_tempest_plugin.api.test_address_groups.AddressGroupTest.test_add_wrong_address_to_address_group [2.402755s] ... ok' - '{0} neutron_tempest_plugin.api.test_address_groups.AddressGroupTest.test_address_group_create_with_wrong_address [0.219410s] ... ok' - '{0} neutron_tempest_plugin.api.test_address_groups.AddressGroupTest.test_address_group_lifecycle [0.778784s] ... ok' - '{0} neutron_tempest_plugin.api.test_address_groups.AddressGroupTest.test_edit_addresses_in_address_group [0.876919s] ... ok' - '{0} neutron_tempest_plugin.api.test_address_groups.AddressGroupTest.test_remove_wrong_address_from_address_group [1.616347s] ... ok' - '{1} neutron_tempest_plugin.api.test_routers.RoutersTest.test_create_router_with_default_snat_value [6.503243s] ... ok' - '{2} neutron_tempest_plugin.api.test_revisions.TestRevisions.test_update_router_extra_routes_bumps_revision [9.675252s] ... ok' - '{3} neutron_tempest_plugin.api.test_qos.QosMinimumBandwidthRuleTestJSON.test_get_rules_by_policy [1.837383s] ... ok' - '{2} neutron_tempest_plugin.api.test_revisions.TestRevisions.test_update_sg_group_bumps_revision [0.736822s] ... ok' - '{3} neutron_tempest_plugin.api.test_qos.QosMinimumBandwidthRuleTestJSON.test_rule_create [1.552827s] ... ok' - '{3} neutron_tempest_plugin.api.test_qos.QosMinimumBandwidthRuleTestJSON.test_rule_create_fail_for_missing_min_kbps [0.130022s] ... ok' - '{2} neutron_tempest_plugin.api.test_revisions.TestRevisions.test_update_sg_rule_bumps_sg_revision [1.916058s] ... ok' - '{3} neutron_tempest_plugin.api.test_qos.QosMinimumBandwidthRuleTestJSON.test_rule_create_fail_for_the_same_type [0.719677s] ... ok' - '{3} neutron_tempest_plugin.api.test_qos.QosMinimumBandwidthRuleTestJSON.test_rule_create_forbidden_for_regular_tenants [0.271059s] ... 
ok' - '{0} neutron_tempest_plugin.api.test_address_scopes.RbacAddressScopeTest.test_filter_fields [1.914653s] ... ok' - '{3} neutron_tempest_plugin.api.test_qos.QosMinimumBandwidthRuleTestJSON.test_rule_create_pass_for_direction_ingress [0.502831s] ... ok' - '{3} neutron_tempest_plugin.api.test_qos.QosMinimumBandwidthRuleTestJSON.test_rule_create_rule_nonexistent_policy [0.101398s] ... ok' - '{3} neutron_tempest_plugin.api.test_qos.QosMinimumBandwidthRuleTestJSON.test_rule_delete [0.963674s] ... ok' - '{0} neutron_tempest_plugin.api.test_address_scopes.RbacAddressScopeTest.test_filter_rbac_policies [1.119058s] ... ok' - '{0} neutron_tempest_plugin.api.test_address_scopes.RbacAddressScopeTest.test_policy_target_update [0.538627s] ... ok' - '{3} neutron_tempest_plugin.api.test_qos.QosMinimumBandwidthRuleTestJSON.test_rule_update [1.255874s] ... ok' - '{2} neutron_tempest_plugin.api.test_revisions.TestRevisions.test_update_subnet_bumps_network_revision [3.560233s] ... ok' - '{0} neutron_tempest_plugin.api.test_address_scopes.RbacAddressScopeTest.test_rbac_policy_show [2.044851s] ... ok' - '{0} neutron_tempest_plugin.api.test_address_scopes.RbacAddressScopeTest.test_regular_client_blocked_from_sharing_anothers_policy [0.523255s] ... ok' - '{0} neutron_tempest_plugin.api.test_address_scopes.RbacAddressScopeTest.test_regular_client_shares_to_another_regular_client [1.571893s] ... ok' - '{1} neutron_tempest_plugin.api.test_routers.RoutersTest.test_create_router_with_snat_explicit [10.975996s] ... ok' - '{2} neutron_tempest_plugin.api.test_revisions.TestRevisions.test_update_subnet_bumps_revision [4.212800s] ... ok' - '{0} neutron_tempest_plugin.api.test_address_scopes.RbacAddressScopeTest.test_subnet_pool_presence_prevents_rbac_policy_deletion [1.816350s] ... ok' - '{1} neutron_tempest_plugin.api.test_routers.RoutersTest.test_create_update_router_description [1.543181s] ... ok' - '{1} neutron_tempest_plugin.api.test_routers.RoutersTest.test_extra_routes_atomic ... 
SKIPPED: Skipped because network extension: extraroute-atomic is not enabled' - '{2} neutron_tempest_plugin.api.test_revisions.TestRevisions.test_update_subnet_service_types_bumps_revisions [4.217279s] ... ok' - '{2} neutron_tempest_plugin.api.test_revisions.TestRevisions.test_update_subnetpool_bumps_revision [0.750047s] ... ok' - '{3} neutron_tempest_plugin.api.test_qos.QosMinimumPpsRuleTestJSON.test_get_rules_by_policy [2.311268s] ... ok' - '{3} neutron_tempest_plugin.api.test_qos.QosMinimumPpsRuleTestJSON.test_rule_create [0.944467s] ... ok' - '{3} neutron_tempest_plugin.api.test_qos.QosMinimumPpsRuleTestJSON.test_rule_create_any_direction_when_egress_direction_exists [0.717572s] ... ok' - '{3} neutron_tempest_plugin.api.test_qos.QosMinimumPpsRuleTestJSON.test_rule_create_egress_direction_when_any_direction_exists [1.034693s] ... ok' - '{0} neutron_tempest_plugin.api.test_availability_zones.ListAvailableZonesTest.test_list_available_zones [0.497800s] ... ok' - '{3} neutron_tempest_plugin.api.test_qos.QosMinimumPpsRuleTestJSON.test_rule_create_fail_for_missing_min_kpps [0.280078s] ... ok' - '{3} neutron_tempest_plugin.api.test_qos.QosMinimumPpsRuleTestJSON.test_rule_create_fail_for_the_same_type [0.976303s] ... ok' - '{3} neutron_tempest_plugin.api.test_qos.QosMinimumPpsRuleTestJSON.test_rule_create_forbidden_for_regular_tenants [0.713278s] ... ok' - '{3} neutron_tempest_plugin.api.test_qos.QosMinimumPpsRuleTestJSON.test_rule_delete [1.514824s] ... ok' - '{3} neutron_tempest_plugin.api.test_qos.QosMinimumPpsRuleTestJSON.test_rule_update [1.268139s] ... ok' - '{1} neutron_tempest_plugin.api.test_routers.RoutersTest.test_network_attached_with_two_routers [14.520443s] ... ok' - '{2} setUpClass (neutron_tempest_plugin.api.test_routers.DvrRoutersTest) ... SKIPPED: dvr extension not enabled.' - '{2} setUpClass (neutron_tempest_plugin.api.test_routers_negative.HaRoutersNegativeTest) ... SKIPPED: l3-ha extension not enabled.' 
- '{3} neutron_tempest_plugin.api.test_qos.QosMinimumPpsRuleTestJSON.test_rule_update_direction_conflict [1.536655s] ... ok' - '{0} neutron_tempest_plugin.api.test_extension_driver_port_security.PortSecTest.test_allowed_address_pairs [4.477666s] ... ok' - '{0} neutron_tempest_plugin.api.test_extension_driver_port_security.PortSecTest.test_create_port_sec_with_security_group [6.647612s] ... ok' - '{1} neutron_tempest_plugin.api.test_routers.RoutersTest.test_router_interface_status [10.136300s] ... ok' - '{3} neutron_tempest_plugin.api.test_qos_negative.QosDscpRuleNegativeTestJSON.test_rule_create_rule_non_existent_policy [1.169822s] ... ok' - '{3} neutron_tempest_plugin.api.test_qos_negative.QosDscpRuleNegativeTestJSON.test_rule_update_rule_nonexistent_policy [0.457727s] ... ok' - '{3} neutron_tempest_plugin.api.test_qos_negative.QosDscpRuleNegativeTestJSON.test_rule_update_rule_nonexistent_rule [0.322501s] ... ok' - '{0} neutron_tempest_plugin.api.test_extension_driver_port_security.PortSecTest.test_delete_with_port_sec [4.922496s] ... ok' - '{2} neutron_tempest_plugin.api.test_routers_negative.RoutersNegativePolicyTest.test_add_interface_in_use [4.835873s] ... ok' - '{2} neutron_tempest_plugin.api.test_routers_negative.RoutersNegativePolicyTest.test_add_interface_port_nonexist [0.245110s] ... ok' - '{0} neutron_tempest_plugin.api.test_extension_driver_port_security.PortSecTest.test_port_sec_default_value [1.866282s] ... ok' - '{0} neutron_tempest_plugin.api.test_extension_driver_port_security.PortSecTest.test_port_sec_specific_value_1 [4.418949s] ... ok' - '{2} neutron_tempest_plugin.api.test_routers_negative.RoutersNegativePolicyTest.test_add_interface_wrong_tenant [6.480649s] ... ok' - '{0} neutron_tempest_plugin.api.test_extension_driver_port_security.PortSecTest.test_port_sec_specific_value_2 [2.663435s] ... ok' - '{1} neutron_tempest_plugin.api.test_routers.RoutersTest.test_router_interface_update_and_remove_gateway_ip [15.605425s] ... 
ok' - '{0} neutron_tempest_plugin.api.test_extension_driver_port_security.PortSecTest.test_port_sec_update_pass [4.418553s] ... ok' - '{2} neutron_tempest_plugin.api.test_routers_negative.RoutersNegativePolicyTest.test_remove_associated_ports [12.121457s] ... ok' - '{0} neutron_tempest_plugin.api.test_extension_driver_port_security.PortSecTest.test_port_sec_update_port_failed [6.517988s] ... ok' - '{1} neutron_tempest_plugin.api.test_routers.RoutersTest.test_update_extra_route [11.304971s] ... ok' - '{3} neutron_tempest_plugin.api.test_routers.RoutersDeleteTest.test_delete_router [17.238146s] ... ok' - '{1} neutron_tempest_plugin.api.test_routers.RoutersTest.test_update_router_reset_gateway_without_snat [10.983228s] ... ok' - '{1} neutron_tempest_plugin.api.test_routers.RoutersTest.test_update_router_set_gateway_with_snat_explicit [17.875504s] ... ok' - '{0} neutron_tempest_plugin.api.test_extensions.ExtensionsTest.test_list_extensions_includes_all [0.528774s] ... ok' - '{0} neutron_tempest_plugin.api.test_extensions.ExtensionsTest.test_list_extensions_pagination [0.156379s] ... ok' - '{0} neutron_tempest_plugin.api.test_extensions.ExtensionsTest.test_list_extensions_project_id [0.084432s] ... ok' - '{0} neutron_tempest_plugin.api.test_extensions.ExtensionsTest.test_list_extensions_sorting [0.069959s] ... ok' - '{2} neutron_tempest_plugin.api.test_security_groups.RbacSharedSecurityGroupTest.test_filter_fields [4.639863s] ... ok' - '{1} neutron_tempest_plugin.api.test_routers.RoutersTest.test_update_router_set_gateway_without_snat [10.300219s] ... ok' - '{2} neutron_tempest_plugin.api.test_security_groups.RbacSharedSecurityGroupTest.test_filter_rbac_policies [1.106386s] ... ok' - '{2} neutron_tempest_plugin.api.test_security_groups.RbacSharedSecurityGroupTest.test_policy_target_update [0.881881s] ... ok' - '{0} neutron_tempest_plugin.api.test_flavors_extensions.TestFlavorsJson.test_create_update_delete_flavor [0.585025s] ... 
ok' - '{0} neutron_tempest_plugin.api.test_flavors_extensions.TestFlavorsJson.test_create_update_delete_service_profile [0.421364s] ... ok' - '{0} neutron_tempest_plugin.api.test_flavors_extensions.TestFlavorsJson.test_list_flavors [0.098676s] ... ok' - '{0} neutron_tempest_plugin.api.test_flavors_extensions.TestFlavorsJson.test_list_service_profiles [0.073018s] ... ok' - '{0} neutron_tempest_plugin.api.test_flavors_extensions.TestFlavorsJson.test_show_flavor [0.133334s] ... ok' - '{0} neutron_tempest_plugin.api.test_flavors_extensions.TestFlavorsJson.test_show_service_profile [0.125014s] ... ok' - '{2} neutron_tempest_plugin.api.test_security_groups.RbacSharedSecurityGroupTest.test_port_presence_prevents_policy_rbac_policy_deletion [2.809717s] ... ok' - '{3} neutron_tempest_plugin.api.test_security_groups.SecGroupRulesQuotaTest.test_create_sg_rules_when_quota_disabled [13.863572s] ... ok' - '{2} neutron_tempest_plugin.api.test_security_groups.RbacSharedSecurityGroupTest.test_rbac_policy_show [1.102096s] ... ok' - '{2} neutron_tempest_plugin.api.test_security_groups.RbacSharedSecurityGroupTest.test_regular_client_blocked_from_sharing_anothers_policy [1.475566s] ... ok' - '{1} setUpClass (neutron_tempest_plugin.api.test_routers_negative.DvrRoutersNegativeTest) ... SKIPPED: dvr extension not enabled.' - '{1} setUpClass (neutron_tempest_plugin.api.test_routers_negative.DvrRoutersNegativeTestExtended) ... SKIPPED: dvr extension not enabled.' - '{2} neutron_tempest_plugin.api.test_security_groups.RbacSharedSecurityGroupTest.test_regular_client_shares_to_another_regular_client [2.555682s] ... ok' - '{3} neutron_tempest_plugin.api.test_security_groups.SecGroupRulesQuotaTest.test_sg_rules_quota_decrease_less_than_created [7.546671s] ... ok' - '{1} neutron_tempest_plugin.api.test_security_groups_negative.NegativeSecGroupRulesQuotaTest.test_sg_creation_with_insufficient_sg_rules_quota [1.206775s] ... 
ok' - '{1} neutron_tempest_plugin.api.test_service_type_management.ServiceTypeManagementTest.test_service_provider_list [0.457842s] ... ok' - '{1} setUpClass (neutron_tempest_plugin.api.test_subnetpools.RbacSubnetPoolTest) ... SKIPPED: rbac-subnetpool extension not enabled.' - '{2} neutron_tempest_plugin.api.test_security_groups.SecGroupNormalizedCidrTest.test_normalized_cidr_in_rule [1.846059s] ... ok' - '{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsSearchCriteriaTest.test_list_no_pagination_limit_0 [0.116002s] ... ok' - '{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsSearchCriteriaTest.test_list_pagination [0.586737s] ... ok' - '{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsSearchCriteriaTest.test_list_pagination_page_reverse_asc [0.355545s] ... ok' - '{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsSearchCriteriaTest.test_list_pagination_page_reverse_desc [0.216798s] ... ok' - '{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsSearchCriteriaTest.test_list_pagination_page_reverse_with_href_links [0.429008s] ... ok' - '{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsSearchCriteriaTest.test_list_pagination_with_href_links [1.618602s] ... ok' - '{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsSearchCriteriaTest.test_list_pagination_with_marker [0.656340s] ... ok' - '{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsSearchCriteriaTest.test_list_sorts_asc [0.099070s] ... ok' - '{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsSearchCriteriaTest.test_list_sorts_desc [0.120912s] ... ok' - '{0} neutron_tempest_plugin.api.test_floating_ips_negative.FloatingIPNegativeTestJSON.test_associate_floatingip_with_port_with_floatingip [12.073557s] ... ok' - '{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsSearchCriteriaTest.test_list_validation_filters [0.160927s] ... 
ok' - '{3} neutron_tempest_plugin.api.test_security_groups.SecGroupRulesQuotaTest.test_sg_rules_quota_increased [18.821029s] ... ok' - '{3} neutron_tempest_plugin.api.test_security_groups.SecGroupRulesQuotaTest.test_sg_rules_quota_values [1.488369s] ... ok' - '{2} neutron_tempest_plugin.api.test_security_groups.SecGroupQuotaTest.test_create_sg_when_quota_disabled [10.149166s] ... ok' - '{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTest.test_admin_create_shared_subnetpool [1.011119s] ... ok' - '{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTest.test_create_list_subnetpool [0.360832s] ... ok' - '{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTest.test_create_subnet_from_pool_with_default_prefixlen [2.250085s] ... ok' - '{3} neutron_tempest_plugin.api.test_security_groups.StatefulSecGroupProtocolIPv6Test.test_security_group_rule_protocol_legacy_icmpv6 [2.366803s] ... ok' - '{0} setUpClass (neutron_tempest_plugin.api.test_metering_extensions.MeteringTestJSON) ... SKIPPED: metering extension not enabled.' - '{0} setUpClass (neutron_tempest_plugin.api.test_ndp_proxy_negative.NDPProxyNegativeTestJSON) ... SKIPPED: l3-ndp-proxy extension not enabled.' - '{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTest.test_create_subnet_from_pool_with_prefixlen [2.890566s] ... ok' - '{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTest.test_create_subnet_from_pool_with_quota [2.385769s] ... ok' - '{0} neutron_tempest_plugin.api.test_network_ip_availability_negative.NetworksIpAvailabilityNegativeTest.test_network_availability_nonexistent_network_id [1.071365s] ... ok' - '{2} neutron_tempest_plugin.api.test_security_groups.SecGroupQuotaTest.test_max_allowed_sg_amount [9.223481s] ... ok' - '{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTest.test_create_subnet_from_pool_with_subnet_cidr [3.959933s] ... 
ok' - '{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTest.test_create_subnetpool_associate_address_scope [0.495488s] ... ok' - '{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTest.test_create_update_subnetpool_description [0.956725s] ... ok' - '{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTest.test_get_subnetpool [0.389932s] ... ok' - '{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTest.test_show_subnetpool_has_project_id [0.389678s] ... ok' - '{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTest.test_tenant_create_non_default_subnetpool [0.316511s] ... ok' - '{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTest.test_tenant_update_subnetpool [0.375629s] ... ok' - '{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTest.test_update_subnetpool_associate_address_scope [0.603394s] ... ok' - '{0} neutron_tempest_plugin.api.test_networks.NetworksMtuTestJSON.test_create_network_custom_mtu [1.564202s] ... ok' - '{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTest.test_update_subnetpool_associate_another_address_scope [0.859904s] ... ok' - '{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTest.test_update_subnetpool_disassociate_address_scope [0.638586s] ... ok' - '{0} neutron_tempest_plugin.api.test_networks.NetworksMtuTestJSON.test_update_network_custom_mtu [1.234435s] ... ok' - '{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTest.test_update_subnetpool_prefixes_append [0.345434s] ... ok' - '{2} neutron_tempest_plugin.api.test_security_groups.SecGroupQuotaTest.test_sg_quota_decrease_less_than_created [8.530250s] ... ok' - '{3} neutron_tempest_plugin.api.test_security_groups.StatefulSecGroupProtocolTest.test_security_group_rule_protocol_ints [9.900287s] ... ok' - '{1} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTest.test_update_subnetpool_prefixes_extend [0.421089s] ... 
ok' - '{0} neutron_tempest_plugin.api.test_networks.NetworksTestJSON.test_create_network_with_project [1.168753s] ... ok' - '{3} neutron_tempest_plugin.api.test_security_groups.StatefulSecGroupProtocolTest.test_security_group_rule_protocol_names [7.962857s] ... ok' - '{0} neutron_tempest_plugin.api.test_networks.NetworksTestJSON.test_create_update_network_dns_domain [1.492245s] ... ok' - '{0} neutron_tempest_plugin.api.test_networks.NetworksTestJSON.test_list_networks_fields_keystone_v3 [0.902678s] ... ok' - '{0} neutron_tempest_plugin.api.test_networks.NetworksTestJSON.test_show_network [0.245780s] ... ok' - '{0} neutron_tempest_plugin.api.test_networks.NetworksTestJSON.test_show_network_fields_keystone_v3 [0.895733s] ... ok' - '{3} neutron_tempest_plugin.api.test_security_groups.StatelessSecGroupProtocolIPv6Test.test_security_group_rule_protocol_legacy_icmpv6 [2.503980s] ... ok' - '{2} neutron_tempest_plugin.api.test_security_groups.SecGroupQuotaTest.test_sg_quota_increased [16.012126s] ... ok' - '{1} neutron_tempest_plugin.api.test_subnets.SubnetServiceTypeTestJSON.test_allocate_ips_are_from_correct_subnet [3.561559s] ... ok' - '{2} neutron_tempest_plugin.api.test_security_groups.SecGroupQuotaTest.test_sg_quota_values [2.483802s] ... ok' - '{0} neutron_tempest_plugin.api.test_ports.PortsIpv6TestJSON.test_add_ipv6_ips_to_port [1.939455s] ... ok' - '{0} neutron_tempest_plugin.api.test_ports.PortsTaggingOnCreationTestJSON.test_tagging_ports_during_bulk_creation ... SKIPPED: Skipped because network extension: tag-ports-during-bulk-creation is not enabled' - '{0} neutron_tempest_plugin.api.test_ports.PortsTaggingOnCreationTestJSON.test_tagging_ports_during_bulk_creation_no_tags ... SKIPPED: Skipped because network extension: tag-ports-during-bulk-creation is not enabled' - '{0} neutron_tempest_plugin.api.test_ports.PortsTaggingOnCreationTestJSON.test_tagging_ports_during_creation ... 
SKIPPED: Skipped because network extension: tag-ports-during-bulk-creation is not enabled' - '{3} neutron_tempest_plugin.api.test_security_groups.StatelessSecGroupProtocolTest.test_security_group_rule_protocol_ints [8.725653s] ... ok' - '{2} neutron_tempest_plugin.api.test_security_groups.StatelessSecGroupTest.test_create_bulk_sec_groups [1.520815s] ... ok' - '{2} neutron_tempest_plugin.api.test_security_groups.StatelessSecGroupTest.test_create_list_update_show_delete_security_group [0.914927s] ... ok' - '{1} neutron_tempest_plugin.api.test_trunk.TrunkTestJSON.test_add_subports [7.229157s] ... ok' - '{0} neutron_tempest_plugin.api.test_qos.QosDscpMarkingRuleTestJSON.test_get_rules_by_policy [2.041917s] ... ok' - '{0} neutron_tempest_plugin.api.test_qos.QosDscpMarkingRuleTestJSON.test_invalid_rule_create [0.189792s] ... ok' - '{2} neutron_tempest_plugin.api.test_security_groups.StatelessSecGroupTest.test_create_sec_groups_with_the_same_name [2.613985s] ... ok' - '{0} neutron_tempest_plugin.api.test_qos.QosDscpMarkingRuleTestJSON.test_rule_create [0.669689s] ... ok' - '{3} neutron_tempest_plugin.api.test_security_groups.StatelessSecGroupProtocolTest.test_security_group_rule_protocol_names [8.242699s] ... ok' - '{0} neutron_tempest_plugin.api.test_qos.QosDscpMarkingRuleTestJSON.test_rule_create_fail_for_the_same_type [0.421599s] ... ok' - '{0} neutron_tempest_plugin.api.test_qos.QosDscpMarkingRuleTestJSON.test_rule_create_forbidden_for_regular_tenants [0.066699s] ... ok' - '{0} neutron_tempest_plugin.api.test_qos.QosDscpMarkingRuleTestJSON.test_rule_create_rule_nonexistent_policy [0.093933s] ... ok' - '{0} neutron_tempest_plugin.api.test_qos.QosDscpMarkingRuleTestJSON.test_rule_delete [1.564835s] ... ok' - '{1} neutron_tempest_plugin.api.test_trunk.TrunkTestJSON.test_create_show_delete_trunk [3.676733s] ... ok' - '{2} neutron_tempest_plugin.api.test_security_groups.StatelessSecGroupTest.test_list_security_group_rules_contains_all_rules [2.519428s] ... 
ok' - '{0} neutron_tempest_plugin.api.test_qos.QosDscpMarkingRuleTestJSON.test_rule_update [0.809947s] ... ok' - '{2} neutron_tempest_plugin.api.test_security_groups.StatelessSecGroupTest.test_show_security_group_contains_all_rules [0.957291s] ... ok' - '{2} neutron_tempest_plugin.api.test_security_groups.StatelessSecGroupTest.test_stateless_security_group_update [2.001341s] ... ok' - '{1} neutron_tempest_plugin.api.test_trunk.TrunkTestJSON.test_create_trunk_empty_subports_list [3.446350s] ... ok' - '{3} neutron_tempest_plugin.api.test_security_groups_negative.NegativeSecGroupQuotaTest.test_create_excess_sg [1.883090s] ... ok' - '{3} neutron_tempest_plugin.api.test_security_groups_negative.NegativeSecGroupQuotaTest.test_sg_quota_incorrect_values [1.866960s] ... ok' - '{1} neutron_tempest_plugin.api.test_trunk.TrunkTestJSON.test_create_trunk_subports_not_specified [5.127519s] ... ok' - '{0} neutron_tempest_plugin.api.test_qos_negative.QosMinimumBandwidthRuleNegativeTestJSON.test_rule_create_rule_non_existent_policy [0.636088s] ... ok' - '{0} neutron_tempest_plugin.api.test_qos_negative.QosMinimumBandwidthRuleNegativeTestJSON.test_rule_update_rule_nonexistent_policy [1.448824s] ... ok' - '{0} neutron_tempest_plugin.api.test_qos_negative.QosMinimumBandwidthRuleNegativeTestJSON.test_rule_update_rule_nonexistent_rule [0.353136s] ... ok' - '{1} neutron_tempest_plugin.api.test_trunk.TrunkTestJSON.test_create_update_trunk [6.095812s] ... ok' - '{3} neutron_tempest_plugin.api.test_subnetpool_prefix_ops.SubnetPoolPrefixOpsIpv4Test.test_add_overlapping_prefix [0.849248s] ... ok' - '{3} neutron_tempest_plugin.api.test_subnetpool_prefix_ops.SubnetPoolPrefixOpsIpv4Test.test_add_remove_prefix [0.653313s] ... ok' - '{2} neutron_tempest_plugin.api.test_security_groups_negative.NegativeSecGroupIPv6Test.test_assign_nonexistent_sec_group [1.801077s] ... ok' - '{1} neutron_tempest_plugin.api.test_trunk.TrunkTestJSON.test_create_update_trunk_with_description [4.359475s] ... 
ok' - '{2} neutron_tempest_plugin.api.test_security_groups_negative.NegativeSecGroupIPv6Test.test_assign_sec_group_twice [2.561247s] ... ok' - '{2} neutron_tempest_plugin.api.test_security_groups_negative.NegativeSecGroupIPv6Test.test_create_security_group_with_boolean_type_name [0.176918s] ... ok' - '{2} neutron_tempest_plugin.api.test_security_groups_negative.NegativeSecGroupIPv6Test.test_create_security_group_with_too_long_name [0.049159s] ... ok' - '{2} neutron_tempest_plugin.api.test_security_groups_negative.NegativeSecGroupIPv6Test.test_delete_in_use_sec_group [1.139535s] ... ok' - '{0} setUpClass (neutron_tempest_plugin.api.test_router_interface_fip.RouterInterfaceFip) ... SKIPPED: Skipped because network extension: router-interface-fip is not enabled' - '{0} setUpClass (neutron_tempest_plugin.api.test_routers.DvrRoutersTestUpdateDistributedExtended) ... SKIPPED: dvr extension not enabled.' - '{2} neutron_tempest_plugin.api.test_security_groups_negative.NegativeSecGroupIPv6Test.test_no_sec_group_changes_after_assignment_failure [1.913982s] ... ok' - '{2} neutron_tempest_plugin.api.test_security_groups_negative.NegativeSecGroupIPv6Test.test_update_default_security_group_name [0.528390s] ... ok' - '{2} neutron_tempest_plugin.api.test_security_groups_negative.NegativeSecGroupIPv6Test.test_update_security_group_with_boolean_type_name [0.427186s] ... ok' - '{2} neutron_tempest_plugin.api.test_security_groups_negative.NegativeSecGroupIPv6Test.test_update_security_group_with_too_long_name [0.346642s] ... ok' - '{1} neutron_tempest_plugin.api.test_trunk.TrunkTestJSON.test_delete_trunk_with_subport_is_allowed [6.266959s] ... ok' - '{3} neutron_tempest_plugin.api.test_timestamp.TestTimeStamp.test_create_network_with_timestamp [2.070363s] ... ok' - '{0} neutron_tempest_plugin.api.test_routers.RoutersSearchCriteriaTest.test_list_no_pagination_limit_0 [0.187413s] ... 
ok' - '{0} neutron_tempest_plugin.api.test_routers.RoutersSearchCriteriaTest.test_list_pagination [0.904493s] ... ok' - '{3} neutron_tempest_plugin.api.test_timestamp.TestTimeStamp.test_create_port_with_timestamp [1.871069s] ... ok' - '{0} neutron_tempest_plugin.api.test_routers.RoutersSearchCriteriaTest.test_list_pagination_page_reverse_asc [0.245023s] ... ok' - '{0} neutron_tempest_plugin.api.test_routers.RoutersSearchCriteriaTest.test_list_pagination_page_reverse_desc [0.331584s] ... ok' - '{0} neutron_tempest_plugin.api.test_routers.RoutersSearchCriteriaTest.test_list_pagination_page_reverse_with_href_links [0.617031s] ... ok' - '{3} neutron_tempest_plugin.api.test_timestamp.TestTimeStamp.test_create_subnet_with_timestamp [2.802274s] ... ok' - '{3} neutron_tempest_plugin.api.test_timestamp.TestTimeStamp.test_create_subnetpool_with_timestamp [1.111835s] ... ok' - '{0} neutron_tempest_plugin.api.test_routers.RoutersSearchCriteriaTest.test_list_pagination_with_href_links [4.778374s] ... ok' - '{3} neutron_tempest_plugin.api.test_timestamp.TestTimeStamp.test_segment_with_timestamp [2.577400s] ... ok' - '{1} neutron_tempest_plugin.api.test_trunk.TrunkTestJSON.test_get_subports [9.679438s] ... ok' - '{0} neutron_tempest_plugin.api.test_routers.RoutersSearchCriteriaTest.test_list_pagination_with_marker [1.170398s] ... ok' - '{0} neutron_tempest_plugin.api.test_routers.RoutersSearchCriteriaTest.test_list_sorts_asc [0.135874s] ... ok' - '{0} neutron_tempest_plugin.api.test_routers.RoutersSearchCriteriaTest.test_list_sorts_desc [0.147998s] ... ok' - '{3} neutron_tempest_plugin.api.test_timestamp.TestTimeStamp.test_show_networks_attribute_with_timestamp [0.942313s] ... ok' - '{3} neutron_tempest_plugin.api.test_timestamp.TestTimeStamp.test_show_port_attribute_with_timestamp [1.361725s] ... ok' - '{2} neutron_tempest_plugin.api.test_security_groups_negative.NegativeSecGroupTest.test_assign_nonexistent_sec_group [3.788244s] ... 
ok' - '{3} neutron_tempest_plugin.api.test_timestamp.TestTimeStamp.test_show_subnet_attribute_with_timestamp [4.909637s] ... ok' - '{3} neutron_tempest_plugin.api.test_timestamp.TestTimeStamp.test_show_subnetpool_attribute_with_timestamp [0.374425s] ... ok' - '{2} neutron_tempest_plugin.api.test_security_groups_negative.NegativeSecGroupTest.test_assign_sec_group_twice [2.266525s] ... ok' - '{2} neutron_tempest_plugin.api.test_security_groups_negative.NegativeSecGroupTest.test_create_security_group_with_boolean_type_name [0.113535s] ... ok' - '{2} neutron_tempest_plugin.api.test_security_groups_negative.NegativeSecGroupTest.test_create_security_group_with_too_long_name [0.065795s] ... ok' - '{3} neutron_tempest_plugin.api.test_timestamp.TestTimeStamp.test_update_network_with_timestamp [1.739238s] ... ok' - '{1} neutron_tempest_plugin.api.test_trunk.TrunkTestJSON.test_list_trunks [9.502239s] ... ok' - '{2} neutron_tempest_plugin.api.test_security_groups_negative.NegativeSecGroupTest.test_delete_in_use_sec_group [1.437100s] ... ok' - '{3} neutron_tempest_plugin.api.test_timestamp.TestTimeStamp.test_update_port_with_timestamp [1.722476s] ... ok' - '{2} neutron_tempest_plugin.api.test_security_groups_negative.NegativeSecGroupTest.test_no_sec_group_changes_after_assignment_failure [1.818715s] ... ok' - '{2} neutron_tempest_plugin.api.test_security_groups_negative.NegativeSecGroupTest.test_update_default_security_group_name [0.603958s] ... ok' - '{2} neutron_tempest_plugin.api.test_security_groups_negative.NegativeSecGroupTest.test_update_security_group_with_boolean_type_name [0.384920s] ... ok' - '{2} neutron_tempest_plugin.api.test_security_groups_negative.NegativeSecGroupTest.test_update_security_group_with_too_long_name [0.293602s] ... ok' - '{3} neutron_tempest_plugin.api.test_timestamp.TestTimeStamp.test_update_subnet_with_timestamp [2.775865s] ... 
ok
{3} neutron_tempest_plugin.api.test_timestamp.TestTimeStamp.test_update_subnetpool_with_timestamp [0.377209s] ... ok
{0} neutron_tempest_plugin.api.test_security_groups.SecGroupSearchCriteriaTest.test_list_no_pagination_limit_0 [0.174312s] ... ok
{0} neutron_tempest_plugin.api.test_security_groups.SecGroupSearchCriteriaTest.test_list_pagination [0.904920s] ... ok
{0} neutron_tempest_plugin.api.test_security_groups.SecGroupSearchCriteriaTest.test_list_sorts_by_name_asc [0.190485s] ... ok
{0} neutron_tempest_plugin.api.test_security_groups.SecGroupSearchCriteriaTest.test_list_sorts_by_name_desc [0.232952s] ... ok
{1} neutron_tempest_plugin.api.test_trunk.TrunkTestJSON.test_remove_subport [10.731028s] ... ok
{1} neutron_tempest_plugin.api.test_trunk.TrunkTestJSON.test_show_trunk_has_project_id [4.056402s] ... ok
{2} neutron_tempest_plugin.api.test_security_groups_negative.NegativeStatelessSecGroupTest.test_adding_stateful_sg_to_port_with_stateless_sg [1.080163s] ... ok
{0} neutron_tempest_plugin.api.test_security_groups.StatefulSecGroupTest.test_create_bulk_sec_groups [1.959328s] ... ok
{1} setUpClass (neutron_tempest_plugin.scenario.test_dns_integration.DNSIntegrationTests) ... SKIPPED: DNSIntegrationTests skipped as designate is not available
{2} neutron_tempest_plugin.api.test_security_groups_negative.NegativeStatelessSecGroupTest.test_adding_stateless_sg_to_port_with_stateful_sg [1.047018s] ... ok
{0} neutron_tempest_plugin.api.test_security_groups.StatefulSecGroupTest.test_create_list_update_show_delete_security_group [1.127145s] ... ok
{2} neutron_tempest_plugin.api.test_security_groups_negative.NegativeStatelessSecGroupTest.test_create_port_with_stateful_and_stateless_sg [0.289251s] ... ok
{2} neutron_tempest_plugin.api.test_security_groups_negative.NegativeStatelessSecGroupTest.test_update_used_stateful_sg_to_stateless [0.884506s] ... ok
{2} neutron_tempest_plugin.api.test_security_groups_negative.NegativeStatelessSecGroupTest.test_update_used_stateless_sg_to_stateful [0.867683s] ... ok
{0} neutron_tempest_plugin.api.test_security_groups.StatefulSecGroupTest.test_create_sec_groups_with_the_same_name [2.016971s] ... ok
{3} neutron_tempest_plugin.api.test_timestamp.TestTimeStampWithSecurityGroup.test_create_sg_with_timestamp [0.936924s] ... ok
{3} neutron_tempest_plugin.api.test_timestamp.TestTimeStampWithSecurityGroup.test_create_sgrule_with_timestamp [0.797873s] ... ok
{3} neutron_tempest_plugin.api.test_timestamp.TestTimeStampWithSecurityGroup.test_show_sg_attribute_with_timestamp [0.351629s] ... ok
{0} neutron_tempest_plugin.api.test_security_groups.StatefulSecGroupTest.test_list_security_group_rules_contains_all_rules [2.276316s] ... ok
{3} neutron_tempest_plugin.api.test_timestamp.TestTimeStampWithSecurityGroup.test_show_sgrule_attribute_with_timestamp [0.743161s] ... ok
{0} neutron_tempest_plugin.api.test_security_groups.StatefulSecGroupTest.test_show_security_group_contains_all_rules [1.067496s] ... ok
{3} neutron_tempest_plugin.api.test_timestamp.TestTimeStampWithSecurityGroup.test_update_sg_with_timestamp [2.706891s] ... ok
{2} neutron_tempest_plugin.api.test_subnetpool_prefix_ops.SubnetPoolPrefixOpsIpv6Test.test_add_overlapping_prefix [0.771568s] ... ok
{3} setUpClass (neutron_tempest_plugin.api.test_trunk.TrunkTestMtusJSON) ... SKIPPED: Either vxlan or vlan type driver not enabled.
{2} neutron_tempest_plugin.api.test_subnetpool_prefix_ops.SubnetPoolPrefixOpsIpv6Test.test_add_remove_prefix [1.381416s] ... ok
{3} setUpClass (neutron_tempest_plugin.api.test_trunk_negative.TrunkTestJSON) [0.000000s] ... FAILED

Captured traceback:
~~~~~~~~~~~~~~~~~~~
    Traceback (most recent call last):
      File "/usr/lib/python3.9/site-packages/tempest/test.py", line 185, in setUpClass
        raise value.with_traceback(trace)
      File "/usr/lib/python3.9/site-packages/tempest/test.py", line 170, in setUpClass
        cls.setup_credentials()
      File "/usr/lib/python3.9/site-packages/neutron_tempest_plugin/api/base.py", line 117, in setup_credentials
        super(BaseNetworkTest, cls).setup_credentials()
      File "/usr/lib/python3.9/site-packages/tempest/test.py", line 398, in setup_credentials
        manager = cls.get_client_manager(
      File "/usr/lib/python3.9/site-packages/neutron_tempest_plugin/api/base.py", line 89, in get_client_manager
        manager = super(BaseNetworkTest, cls).get_client_manager(
      File "/usr/lib/python3.9/site-packages/tempest/test.py", line 743, in get_client_manager
        creds = getattr(cred_provider, credentials_method)()
      File "/usr/lib/python3.9/site-packages/tempest/lib/common/dynamic_creds.py", line 473, in get_primary_creds
        return self.get_project_member_creds()
      File "/usr/lib/python3.9/site-packages/tempest/lib/common/dynamic_creds.py", line 514, in get_project_member_creds
        return self.get_credentials(['member'], scope='project')
      File "/usr/lib/python3.9/site-packages/tempest/lib/common/dynamic_creds.py", line 436, in get_credentials
        credentials = self._create_creds(
      File "/usr/lib/python3.9/site-packages/tempest/lib/common/dynamic_creds.py", line 200, in _create_creds
        project = self.creds_client.create_project(
      File "/usr/lib/python3.9/site-packages/tempest/lib/common/cred_client.py", line 164, in create_project
        project = self.projects_client.create_project(
      File "/usr/lib/python3.9/site-packages/tempest/lib/services/identity/v3/projects_client.py", line 37, in create_project
        resp, body = self.post('projects', post_body)
      File "/usr/lib/python3.9/site-packages/tempest/lib/common/rest_client.py", line 314, in post
        resp_header, resp_body = self.request(
      File "/usr/lib/python3.9/site-packages/tempest/lib/common/rest_client.py", line 762, in request
        self._error_checker(resp, resp_body)
      File "/usr/lib/python3.9/site-packages/tempest/lib/common/rest_client.py", line 946, in _error_checker
        raise exceptions.UnexpectedResponseCode(str(resp.status),
    tempest.lib.exceptions.UnexpectedResponseCode: Unexpected response code received
Details: 504

{3} setUpClass (neutron_tempest_plugin.api.test_trunk_negative.TrunkTestMtusJSON) ... SKIPPED: Either vxlan or vlan type driver not enabled.
{3} setUpClass (neutron_tempest_plugin.scenario.test_dvr.NetworkDvrTest) ... SKIPPED: Skipped because network extension: dvr is not enabled
{0} neutron_tempest_plugin.api.test_security_groups_negative.NegativeSecGroupProtocolTest.test_create_security_group_rule_with_ipv6_protocol_integers [2.543355s] ... ok
{3} setUpClass (neutron_tempest_plugin.scenario.test_fip64.Fip64) ... SKIPPED: Skipped because network extension: fip64 is not enabled
{0} neutron_tempest_plugin.api.test_security_groups_negative.NegativeSecGroupProtocolTest.test_create_security_group_rule_with_ipv6_protocol_names [1.470036s] ... ok
{2} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTestV6.test_admin_create_shared_subnetpool [1.618370s] ... ok
{2} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTestV6.test_create_dual_stack_subnets_from_subnetpools [4.784466s] ... ok
{2} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTestV6.test_create_list_subnetpool [0.485529s] ... ok
{0} neutron_tempest_plugin.api.test_subnetpools_negative.SubnetPoolsNegativeTestJSON.test_create_sp_associate_address_scope_multiple_prefix_intersect [1.151299s] ... ok
{2} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTestV6.test_create_subnet_from_pool_with_default_prefixlen [4.934364s] ... ok
{0} neutron_tempest_plugin.api.test_subnetpools_negative.SubnetPoolsNegativeTestJSON.test_create_subnet_different_pools_same_network [5.321652s] ... ok
{0} neutron_tempest_plugin.api.test_subnetpools_negative.SubnetPoolsNegativeTestJSON.test_create_subnetpool_associate_address_scope_of_other_owner [0.244969s] ... ok
{0} neutron_tempest_plugin.api.test_subnetpools_negative.SubnetPoolsNegativeTestJSON.test_create_subnetpool_associate_address_scope_prefix_intersect [0.960678s] ... ok
{0} neutron_tempest_plugin.api.test_subnetpools_negative.SubnetPoolsNegativeTestJSON.test_create_subnetpool_associate_invalid_address_scope [0.047961s] ... ok
{0} neutron_tempest_plugin.api.test_subnetpools_negative.SubnetPoolsNegativeTestJSON.test_create_subnetpool_associate_non_exist_address_scope [0.235721s] ... ok
{0} neutron_tempest_plugin.api.test_subnetpools_negative.SubnetPoolsNegativeTestJSON.test_delete_non_existent_subnetpool [0.061448s] ... ok
{2} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTestV6.test_create_subnet_from_pool_with_prefixlen [2.229584s] ... ok
{0} neutron_tempest_plugin.api.test_subnetpools_negative.SubnetPoolsNegativeTestJSON.test_get_non_existent_subnetpool [0.078766s] ... ok
{0} neutron_tempest_plugin.api.test_subnetpools_negative.SubnetPoolsNegativeTestJSON.test_tenant_create_default_subnetpool [0.080447s] ... ok
{0} neutron_tempest_plugin.api.test_subnetpools_negative.SubnetPoolsNegativeTestJSON.test_tenant_create_shared_subnetpool [0.082199s] ... ok
{0} neutron_tempest_plugin.api.test_subnetpools_negative.SubnetPoolsNegativeTestJSON.test_tenant_create_subnetpool_associate_shared_address_scope ... SKIPPED: Test is outdated starting from Ussuri release.
{0} neutron_tempest_plugin.api.test_subnetpools_negative.SubnetPoolsNegativeTestJSON.test_tenant_get_not_shared_admin_subnetpool [0.442570s] ... ok
{2} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTestV6.test_create_subnet_from_pool_with_quota [2.782988s] ... ok
{0} neutron_tempest_plugin.api.test_subnetpools_negative.SubnetPoolsNegativeTestJSON.test_tenant_update_sp_prefix_associated_with_shared_addr_scope [2.185374s] ... ok
{0} neutron_tempest_plugin.api.test_subnetpools_negative.SubnetPoolsNegativeTestJSON.test_update_non_existent_subnetpool [0.090400s] ... ok
{0} neutron_tempest_plugin.api.test_subnetpools_negative.SubnetPoolsNegativeTestJSON.test_update_subnetpool_associate_address_scope_of_other_owner [0.372359s] ... ok
{0} neutron_tempest_plugin.api.test_subnetpools_negative.SubnetPoolsNegativeTestJSON.test_update_subnetpool_associate_address_scope_wrong_ip_version [0.414117s] ... ok
{2} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTestV6.test_create_subnet_from_pool_with_subnet_cidr [1.997458s] ... ok
{0} neutron_tempest_plugin.api.test_subnetpools_negative.SubnetPoolsNegativeTestJSON.test_update_subnetpool_multiple_prefix_intersect [1.233021s] ... ok
{0} neutron_tempest_plugin.api.test_subnetpools_negative.SubnetPoolsNegativeTestJSON.test_update_subnetpool_not_modifiable_shared [0.170542s] ... ok
{2} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTestV6.test_create_subnetpool_associate_address_scope [0.364110s] ... ok
{2} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTestV6.test_create_update_subnetpool_description [0.487714s] ... ok
{2} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTestV6.test_get_subnetpool [0.281642s] ... ok
{2} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTestV6.test_show_subnetpool_has_project_id [0.202570s] ... ok
{0} neutron_tempest_plugin.api.test_subnetpools_negative.SubnetPoolsNegativeTestJSON.test_update_subnetpool_prefix_intersect [1.010235s] ... ok
{2} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTestV6.test_tenant_create_non_default_subnetpool [0.096367s] ... ok
{0} neutron_tempest_plugin.api.test_subnetpools_negative.SubnetPoolsNegativeTestJSON.test_update_subnetpool_prefixes_shrink [0.317787s] ... ok
{2} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTestV6.test_tenant_update_subnetpool [0.424902s] ... ok
{0} neutron_tempest_plugin.api.test_subnetpools_negative.SubnetPoolsNegativeTestJSON.test_update_subnetpool_tenant_id [0.199502s] ... ok
{2} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTestV6.test_update_subnetpool_associate_address_scope [1.060711s] ... ok
{2} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTestV6.test_update_subnetpool_associate_another_address_scope [0.922087s] ... ok
{2} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTestV6.test_update_subnetpool_disassociate_address_scope [1.168384s] ... ok
{2} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTestV6.test_update_subnetpool_prefixes_append [0.365154s] ... ok
{2} neutron_tempest_plugin.api.test_subnetpools.SubnetPoolsTestV6.test_update_subnetpool_prefixes_extend [0.317641s] ... ok
{0} neutron_tempest_plugin.api.test_subnets.SubnetsSearchCriteriaTest.test_list_no_pagination_limit_0 [0.280983s] ... ok
{0} neutron_tempest_plugin.api.test_subnets.SubnetsSearchCriteriaTest.test_list_pagination [0.859780s] ... ok
{0} neutron_tempest_plugin.api.test_subnets.SubnetsSearchCriteriaTest.test_list_pagination_page_reverse_asc [0.382789s] ... ok
{0} neutron_tempest_plugin.api.test_subnets.SubnetsSearchCriteriaTest.test_list_pagination_page_reverse_desc [0.261022s] ... ok
{0} neutron_tempest_plugin.api.test_subnets.SubnetsSearchCriteriaTest.test_list_pagination_page_reverse_with_href_links [1.561104s] ... ok
{2} neutron_tempest_plugin.api.test_timestamp.TestTimeStampWithL3.test_create_floatingip_with_timestamp [1.839366s] ... ok
{2} neutron_tempest_plugin.api.test_timestamp.TestTimeStampWithL3.test_create_router_with_timestamp [0.228321s] ... ok
{2} neutron_tempest_plugin.api.test_timestamp.TestTimeStampWithL3.test_show_floatingip_attribute_with_timestamp [0.921789s] ... ok
{2} neutron_tempest_plugin.api.test_timestamp.TestTimeStampWithL3.test_show_router_attribute_with_timestamp [0.409341s] ... ok
{0} neutron_tempest_plugin.api.test_subnets.SubnetsSearchCriteriaTest.test_list_pagination_with_href_links [1.992014s] ... ok
{2} neutron_tempest_plugin.api.test_timestamp.TestTimeStampWithL3.test_update_floatingip_with_timestamp [0.838071s] ... ok
{0} neutron_tempest_plugin.api.test_subnets.SubnetsSearchCriteriaTest.test_list_pagination_with_marker [1.194689s] ... ok
{0} neutron_tempest_plugin.api.test_subnets.SubnetsSearchCriteriaTest.test_list_sorts_asc [0.120766s] ... ok
{0} neutron_tempest_plugin.api.test_subnets.SubnetsSearchCriteriaTest.test_list_sorts_desc [0.131685s] ... ok
{0} neutron_tempest_plugin.api.test_subnets.SubnetsSearchCriteriaTest.test_list_validation_filters [0.149506s] ... ok
{2} neutron_tempest_plugin.api.test_timestamp.TestTimeStampWithL3.test_update_router_with_timestamp [2.867470s] ... ok
{0} setUpClass (neutron_tempest_plugin.api.test_trunk.TrunkTestInheritJSONBase) ... SKIPPED: VLAN type_driver is not enabled
{2} neutron_tempest_plugin.api.test_trunk.TrunksSearchCriteriaTest.test_list_no_pagination_limit_0 [0.081539s] ... ok
{2} neutron_tempest_plugin.api.test_trunk.TrunksSearchCriteriaTest.test_list_pagination [0.410396s] ... ok
{2} neutron_tempest_plugin.api.test_trunk.TrunksSearchCriteriaTest.test_list_pagination_page_reverse_asc [0.630073s] ... ok
{2} neutron_tempest_plugin.api.test_trunk.TrunksSearchCriteriaTest.test_list_pagination_page_reverse_desc [0.155979s] ... ok
{2} neutron_tempest_plugin.api.test_trunk.TrunksSearchCriteriaTest.test_list_pagination_page_reverse_with_href_links [0.620196s] ... ok
{2} neutron_tempest_plugin.api.test_trunk.TrunksSearchCriteriaTest.test_list_pagination_with_href_links [1.608244s] ... ok
{2} neutron_tempest_plugin.api.test_trunk.TrunksSearchCriteriaTest.test_list_pagination_with_marker [1.111932s] ... ok
{2} neutron_tempest_plugin.api.test_trunk.TrunksSearchCriteriaTest.test_list_sorts_asc [0.095744s] ... ok
{2} neutron_tempest_plugin.api.test_trunk.TrunksSearchCriteriaTest.test_list_sorts_desc [0.090578s] ... ok
{1} neutron_tempest_plugin.scenario.test_floatingip.DefaultSnatToExternal.test_snat_external_ip [83.958680s] ... ok
{2} neutron_tempest_plugin.api.test_trunk_details.TestTrunkDetailsJSON.test_port_resource_empty_trunk_details [4.348573s] ... ok
{2} neutron_tempest_plugin.api.test_trunk_details.TestTrunkDetailsJSON.test_port_resource_trunk_details_no_subports [5.479316s] ... ok
{2} neutron_tempest_plugin.api.test_trunk_details.TestTrunkDetailsJSON.test_port_resource_trunk_details_with_subport [9.985204s] ... ok
{3} neutron_tempest_plugin.scenario.test_floatingip.FloatingIPPortDetailsTest.test_floatingip_port_details [117.465087s] ... ok
{0} neutron_tempest_plugin.scenario.admin.test_floatingip.FloatingIpTestCasesAdmin.test_two_vms_fips [82.426212s] ... ok
{3} setUpClass (neutron_tempest_plugin.scenario.test_local_ip.LocalIPTest) ... SKIPPED: Skipped because network extension: local_ip is not enabled
{0} setUpClass (neutron_tempest_plugin.scenario.test_dns_integration.DNSIntegrationAdminTests) ... SKIPPED: DNSIntegrationAdminTests skipped as designate is not available
{0} setUpClass (neutron_tempest_plugin.scenario.test_dns_integration.DNSIntegrationExtraTests) ... SKIPPED: DNSIntegrationExtraTests skipped as designate is not available
{1} neutron_tempest_plugin.scenario.test_floatingip.FloatingIpSeparateNetwork.test_east_west_1 [83.416657s] ... ok
{2} neutron_tempest_plugin.scenario.test_basic.NetworkBasicTest.test_basic_instance [82.653480s] ... ok
{2} neutron_tempest_plugin.scenario.test_basic.NetworkBasicTest.test_ping_global_ip_from_vm_with_fip ... SKIPPED: Global IP address is not defined.
{2} neutron_tempest_plugin.scenario.test_connectivity.NetworkConnectivityTest.test_connectivity_dvr_and_no_dvr_routers_in_same_subnet ... SKIPPED: Skipped because network extension: dvr is not enabled
{1} neutron_tempest_plugin.scenario.test_floatingip.FloatingIpSeparateNetwork.test_east_west_2 [89.051591s] ... ok
{0} neutron_tempest_plugin.scenario.test_floatingip.FloatingIpMultipleRoutersTest.test_reuse_ip_address_with_other_fip_on_other_router [153.557620s] ... ok
{3} neutron_tempest_plugin.scenario.test_qos.QoSTest.test_attach_previously_used_port_to_new_instance [163.189172s] ... ok
{0} setUpClass (neutron_tempest_plugin.scenario.test_mac_learning.MacLearningTest) ... SKIPPED: This test requires advanced tools to be executed
{0} setUpClass (neutron_tempest_plugin.scenario.test_migration.NetworkMigrationFromDVR) ... SKIPPED: Skipped because network extension: dvr is not enabled
{0} setUpClass (neutron_tempest_plugin.scenario.test_migration.NetworkMigrationFromHA) ... SKIPPED: Skipped because network extension: dvr is not enabled
{2} neutron_tempest_plugin.scenario.test_connectivity.NetworkConnectivityTest.test_connectivity_router_east_west_traffic [120.918807s] ... ok
{1} neutron_tempest_plugin.scenario.test_floatingip.FloatingIpSeparateNetwork.test_east_west_3 [123.110325s] ... ok
{3} neutron_tempest_plugin.scenario.test_qos.QoSTest.test_create_instance_using_network_with_existing_policy [99.126227s] ... ok
{2} neutron_tempest_plugin.scenario.test_connectivity.NetworkConnectivityTest.test_connectivity_through_2_routers [99.471004s] ... ok
{1} neutron_tempest_plugin.scenario.test_floatingip.FloatingIpSeparateNetwork.test_east_west_4 [79.073514s] ... ok
{2} neutron_tempest_plugin.scenario.test_dhcp.DHCPPortUpdateTest.test_modify_dhcp_port_ip_address ... SKIPPED: OVN driver is required to run this test - LP#1942794 solution only applied to OVN
{3} neutron_tempest_plugin.scenario.test_qos.QoSTest.test_qos_basic_and_update [118.375390s] ... ok
{2} neutron_tempest_plugin.scenario.test_dhcp.DHCPTest.test_extra_dhcp_opts [68.980035s] ... ok
{2} setUpClass (neutron_tempest_plugin.scenario.test_dns_integration.DNSIntegrationDomainPerProjectTests) ... SKIPPED: DNSIntegrationDomainPerProjectTests skipped as designate is not available
{1} neutron_tempest_plugin.scenario.test_ipv6.IPv6Test.test_ipv6_hotplug_dhcpv6stateless [80.165040s] ... ok
{1} neutron_tempest_plugin.scenario.test_ipv6.IPv6Test.test_ipv6_hotplug_slaac [79.753879s] ... ok
{2} neutron_tempest_plugin.scenario.test_floatingip.FloatingIPQosTest.test_qos [99.672187s] ... ok
{1} neutron_tempest_plugin.scenario.test_metadata.MetadataTest.test_metadata_routed ... SKIPPED: Advanced image is required to run this test.
{1} setUpClass (neutron_tempest_plugin.scenario.test_migration.NetworkMigrationFromDVRHA) ... SKIPPED: Skipped because network extension: dvr is not enabled
{1} setUpClass (neutron_tempest_plugin.scenario.test_migration.NetworkMigrationFromLegacy) ... SKIPPED: Skipped because network extension: dvr is not enabled
{1} setUpClass (neutron_tempest_plugin.scenario.test_mtu.NetworkMtuTest) ... SKIPPED: GRE or VXLAN type_driver is not enabled
{2} neutron_tempest_plugin.scenario.test_floatingip.FloatingIpSameNetwork.test_east_west_1 [89.453921s] ... ok
{1} neutron_tempest_plugin.scenario.test_port_forwardings.PortForwardingTestJSON.test_port_forwarding_editing_and_deleting_tcp_rule [102.578268s] ... ok
{2} neutron_tempest_plugin.scenario.test_floatingip.FloatingIpSameNetwork.test_east_west_2 [92.066310s] ... ok
{1} neutron_tempest_plugin.scenario.test_port_forwardings.PortForwardingTestJSON.test_port_forwarding_editing_and_deleting_udp_rule [105.564427s] ... ok
{2} neutron_tempest_plugin.scenario.test_floatingip.FloatingIpSameNetwork.test_east_west_3 [140.558136s] ... ok
{1} neutron_tempest_plugin.scenario.test_port_forwardings.PortForwardingTestJSON.test_port_forwarding_to_2_fixed_ips [107.668195s] ... ok
{2} neutron_tempest_plugin.scenario.test_floatingip.FloatingIpSameNetwork.test_east_west_4 [74.839647s] ... ok
{1} neutron_tempest_plugin.scenario.test_port_forwardings.PortForwardingTestJSON.test_port_forwarding_to_2_servers [151.075935s] ... ok
{2} neutron_tempest_plugin.scenario.test_floatingip.TestFloatingIPUpdate.test_floating_ip_update [98.221068s] ... ok
{1} neutron_tempest_plugin.scenario.test_ports.PortsTest.test_port_with_fixed_ip [46.551135s] ... ok
{2} neutron_tempest_plugin.scenario.test_internal_dns.InternalDNSTest.test_create_and_update_port_with_dns_name [101.263655s] ... ok
{1} neutron_tempest_plugin.scenario.test_ports.PortsTest.test_previously_used_port [119.581036s] ... ok
{2} neutron_tempest_plugin.scenario.test_internal_dns.InternalDNSTest.test_dns_domain_and_name [177.965972s] ... ok
{2} setUpClass (neutron_tempest_plugin.scenario.test_multicast.MulticastTestIPv4) ... SKIPPED: This test require advanced tools for this test
{1} neutron_tempest_plugin.scenario.test_security_groups.StatelessSecGroupDualStackDHCPv6StatelessTest.test_default_sec_grp_scenarios [187.850287s] ... ok
{1} neutron_tempest_plugin.scenario.test_trunk.TrunkTest.test_parent_port_connectivity_after_trunk_deleted_lb ... SKIPPED: Advanced image is required to run this test.
{1} neutron_tempest_plugin.scenario.test_trunk.TrunkTest.test_subport_connectivity ... SKIPPED: Advanced image is required to run this test.
{1} neutron_tempest_plugin.scenario.test_trunk.TrunkTest.test_subport_connectivity_soft_reboot ... SKIPPED: Advanced image is required to run this test.
{2} neutron_tempest_plugin.scenario.test_portsecurity.PortSecurityTest.test_port_security_removed_added_stateful_sg [90.152993s] ... ok
{2} neutron_tempest_plugin.scenario.test_portsecurity.PortSecurityTest.test_port_security_removed_added_stateless_sg [98.417694s] ... ok
{1} neutron_tempest_plugin.scenario.test_trunk.TrunkTest.test_trunk_subport_lifecycle [172.978105s] ... ok
{1} neutron_tempest_plugin.scenario.test_trunk.TrunkTest.test_trunk_vm_migration ... SKIPPED: Less than 2 compute nodes, skipping multinode tests.
{1} setUpClass (neutron_tempest_plugin.scenario.test_vlan_transparency.VlanTransparencyTest) ... SKIPPED: vlan-transparent extension not enabled.
{2} neutron_tempest_plugin.scenario.test_security_groups.StatelessNetworkSecGroupIPv4Test.test_connectivity_between_vms_using_different_sec_groups [151.800634s] ... ok
{2} neutron_tempest_plugin.scenario.test_security_groups.StatelessNetworkSecGroupIPv4Test.test_default_sec_grp_scenarios [157.572479s] ... ok
{2} neutron_tempest_plugin.scenario.test_security_groups.StatelessNetworkSecGroupIPv4Test.test_fragmented_traffic_is_accepted ... SKIPPED: Advanced image is required to run this test.
{2} neutron_tempest_plugin.scenario.test_security_groups.StatelessNetworkSecGroupIPv4Test.test_ip_prefix [90.172081s] ... ok
{2} neutron_tempest_plugin.scenario.test_security_groups.StatelessNetworkSecGroupIPv4Test.test_ip_prefix_negative [140.796251s] ... ok
{2} neutron_tempest_plugin.scenario.test_security_groups.StatelessNetworkSecGroupIPv4Test.test_multiple_ports_portrange_remote [232.085843s] ... ok
{2} neutron_tempest_plugin.scenario.test_security_groups.StatelessNetworkSecGroupIPv4Test.test_multiple_ports_secgroup_inheritance [76.417326s] ... ok
{2} neutron_tempest_plugin.scenario.test_security_groups.StatelessNetworkSecGroupIPv4Test.test_overlapping_sec_grp_rules [250.677933s] ... ok
{2} neutron_tempest_plugin.scenario.test_security_groups.StatelessNetworkSecGroupIPv4Test.test_packets_of_any_connection_state_can_reach_dest ... SKIPPED: Advanced image is required to run this test.
{2} neutron_tempest_plugin.scenario.test_security_groups.StatelessNetworkSecGroupIPv4Test.test_protocol_number_rule [112.017830s] ... ok
{2} neutron_tempest_plugin.scenario.test_security_groups.StatelessNetworkSecGroupIPv4Test.test_reattach_sg_with_changed_mode [44.339067s] ... ok
{2} neutron_tempest_plugin.scenario.test_security_groups.StatelessNetworkSecGroupIPv4Test.test_remote_group [141.707074s] ... ok
{2} neutron_tempest_plugin.scenario.test_security_groups.StatelessNetworkSecGroupIPv4Test.test_remote_group_and_remote_address_group ... SKIPPED: Openvswitch agent is required to run this test
{2} neutron_tempest_plugin.scenario.test_security_groups.StatelessNetworkSecGroupIPv4Test.test_remove_sec_grp_from_active_vm [110.616211s] ... ok
{2} neutron_tempest_plugin.scenario.test_security_groups.StatelessNetworkSecGroupIPv4Test.test_two_sec_groups [44.400442s] ... ok
{2} neutron_tempest_plugin.scenario.test_security_groups.StatelessSecGroupDualStackSlaacTest.test_default_sec_grp_scenarios [152.197966s] ... ok

==============================
Failed 1 tests - output below:
==============================

setUpClass (neutron_tempest_plugin.api.test_trunk_negative.TrunkTestJSON)
-------------------------------------------------------------------------

Captured traceback:
~~~~~~~~~~~~~~~~~~~
    Traceback (most recent call last):
      File "/usr/lib/python3.9/site-packages/tempest/test.py", line 185, in setUpClass
        raise value.with_traceback(trace)
      File "/usr/lib/python3.9/site-packages/tempest/test.py", line 170, in setUpClass
        cls.setup_credentials()
      File "/usr/lib/python3.9/site-packages/neutron_tempest_plugin/api/base.py", line 117, in setup_credentials
        super(BaseNetworkTest, cls).setup_credentials()
      File "/usr/lib/python3.9/site-packages/tempest/test.py", line 398, in setup_credentials
        manager = cls.get_client_manager(
      File "/usr/lib/python3.9/site-packages/neutron_tempest_plugin/api/base.py", line 89, in get_client_manager
        manager = super(BaseNetworkTest, cls).get_client_manager(
      File "/usr/lib/python3.9/site-packages/tempest/test.py", line 743, in get_client_manager
        creds = getattr(cred_provider, credentials_method)()
      File "/usr/lib/python3.9/site-packages/tempest/lib/common/dynamic_creds.py", line 473, in get_primary_creds
        return self.get_project_member_creds()
      File "/usr/lib/python3.9/site-packages/tempest/lib/common/dynamic_creds.py", line 514, in get_project_member_creds
        return self.get_credentials(['member'], scope='project')
      File "/usr/lib/python3.9/site-packages/tempest/lib/common/dynamic_creds.py", line 436, in get_credentials
        credentials = self._create_creds(
      File "/usr/lib/python3.9/site-packages/tempest/lib/common/dynamic_creds.py", line 200, in _create_creds
        project = self.creds_client.create_project(
      File "/usr/lib/python3.9/site-packages/tempest/lib/common/cred_client.py", line 164, in create_project
        project = self.projects_client.create_project(
      File "/usr/lib/python3.9/site-packages/tempest/lib/services/identity/v3/projects_client.py", line 37, in create_project
        resp, body = self.post('projects', post_body)
      File "/usr/lib/python3.9/site-packages/tempest/lib/common/rest_client.py", line 314, in post
        resp_header, resp_body = self.request(
      File "/usr/lib/python3.9/site-packages/tempest/lib/common/rest_client.py", line 762, in request
        self._error_checker(resp, resp_body)
      File "/usr/lib/python3.9/site-packages/tempest/lib/common/rest_client.py", line 946, in _error_checker
        raise exceptions.UnexpectedResponseCode(str(resp.status),
    tempest.lib.exceptions.UnexpectedResponseCode: Unexpected response code received
Details: 504

======
Totals
======
Ran: 696 tests in 4415.2383 sec.
 - Passed: 630
 - Skipped: 65
 - Expected Fail: 0
 - Unexpected Success: 0
 - Failed: 1
Sum of execute time for each test: 6939.5956 sec.

==============
Worker Balance
==============
 - Worker 0 (165 tests) => 0:18:13.217142
 - Worker 1 (161 tests) => 0:44:37.632479
 - Worker 2 (228 tests) => 1:13:35.024311
 - Worker 3 (142 tests) => 0:21:48.839706
~ /
/
Excluded tests
Included tests
~/openshift /
Generate file containing failing tests
Generate subunit, then xml and html results
setUpClass (neutron_tempest_plugin.api.test_trunk_negative.TrunkTestJSON)
/
2026-01-22 18:07:46,835 p=35978 u=zuul n=ansible | ...ignoring
2026-01-22 18:07:46,846 p=35978 u=zuul n=ansible | TASK [tempest : Change tempest directory permission back to original path={{ cifmw_tempest_artifacts_basedir }}, state=directory, recurse=True, owner={{ ansible_user | default(lookup('env', 'USER')) }}, group={{ ansible_user | default(lookup('env', 'USER')) }}] ***
2026-01-22 18:07:46,846 p=35978 u=zuul n=ansible | Thursday 22 January 2026 18:07:46 +0000 (1:14:30.270) 1:16:00.341 ******
2026-01-22 18:07:46,846 p=35978 u=zuul n=ansible | Thursday 22 January 2026 18:07:46 +0000 (1:14:30.270) 1:16:00.341 ******
2026-01-22 18:07:47,111 p=35978 u=zuul n=ansible | changed: [localhost]
2026-01-22 18:07:47,123 p=35978 u=zuul n=ansible | TASK [tempest : Save logs from podman mode=0644, dest={{ cifmw_tempest_artifacts_basedir }}/podman_tempest.log, content="{{ tempest_run_output.stdout }}" ] ***
2026-01-22 18:07:47,123 p=35978 u=zuul n=ansible | Thursday 22 January 2026 18:07:47 +0000 (0:00:00.277) 1:16:00.619 ******
2026-01-22 18:07:47,123 p=35978 u=zuul n=ansible | Thursday 22 January 2026 18:07:47 +0000 (0:00:00.277) 1:16:00.618 ******
2026-01-22 18:07:47,633 p=35978 u=zuul n=ansible | changed: [localhost]
2026-01-22 18:07:47,643 p=35978 u=zuul n=ansible | TASK [tempest : Fail if podman container did not succeed that=['tempest_run_output.failed == false']] ***
2026-01-22 18:07:47,643 p=35978 u=zuul n=ansible | Thursday 22 January 2026 18:07:47 +0000 (0:00:00.519) 1:16:01.138 ******
2026-01-22 18:07:47,643 p=35978 u=zuul n=ansible | Thursday 22 January 2026 18:07:47 +0000 (0:00:00.519) 1:16:01.138 ******
2026-01-22 18:07:47,667 p=35978 u=zuul n=ansible | fatal: [localhost]: FAILED! => assertion: tempest_run_output.failed == false changed: false evaluated_to: false msg: Assertion failed
2026-01-22 18:07:47,668 p=35978 u=zuul n=ansible | NO MORE HOSTS LEFT *************************************************************
2026-01-22 18:07:47,670 p=35978 u=zuul n=ansible | PLAY RECAP *********************************************************************
2026-01-22 18:07:47,670 p=35978 u=zuul n=ansible | localhost : ok=37 changed=18 unreachable=0 failed=1 skipped=27 rescued=0 ignored=1
2026-01-22 18:07:47,670 p=35978 u=zuul n=ansible | Thursday 22 January 2026 18:07:47 +0000 (0:00:00.026) 1:16:01.165 ******
2026-01-22 18:07:47,670 p=35978 u=zuul n=ansible | ===============================================================================
2026-01-22 18:07:47,670 p=35978 u=zuul n=ansible | tempest : Run tempest ------------------------------------------------ 4470.27s
2026-01-22 18:07:47,670 p=35978 u=zuul n=ansible | tempest : Ensure we have tempest container image ----------------------- 48.57s
2026-01-22 18:07:47,670 p=35978 u=zuul n=ansible | os_net_setup : Create subnets ------------------------------------------- 9.03s
2026-01-22 18:07:47,670 p=35978 u=zuul n=ansible | os_net_setup : Create networks ------------------------------------------ 7.09s
2026-01-22 18:07:47,671 p=35978 u=zuul n=ansible | os_net_setup : Create subnet pools -------------------------------------- 5.75s
2026-01-22 18:07:47,671 p=35978 u=zuul n=ansible | os_net_setup : Delete existing subnet pools ----------------------------- 5.64s
2026-01-22 18:07:47,671 p=35978 u=zuul n=ansible | os_net_setup : Delete existing subnets ---------------------------------- 3.62s
2026-01-22 18:07:47,671 p=35978 u=zuul n=ansible | os_net_setup : Delete existing networks --------------------------------- 3.25s
2026-01-22 18:07:47,671 p=35978 u=zuul n=ansible | tempest : Ensure podman is installed ------------------------------------ 1.38s
2026-01-22 18:07:47,671 p=35978 u=zuul n=ansible | tempest : Creating include.txt ------------------------------------------ 0.65s
2026-01-22 18:07:47,671 p=35978 u=zuul n=ansible | tempest : Save logs from podman ----------------------------------------- 0.52s
2026-01-22 18:07:47,671 p=35978 u=zuul n=ansible | tempest : Creating exclude.txt ------------------------------------------ 0.45s
2026-01-22 18:07:47,671 p=35978 u=zuul n=ansible | tempest : Get clouds.yaml ----------------------------------------------- 0.45s
2026-01-22 18:07:47,671 p=35978 u=zuul n=ansible | tempest : Create profile.yaml file -------------------------------------- 0.39s
2026-01-22 18:07:47,671 p=35978 u=zuul n=ansible | tempest : Get keystone data --------------------------------------------- 0.39s
2026-01-22 18:07:47,671 p=35978 u=zuul n=ansible | tempest : Get credentials data ------------------------------------------ 0.37s
2026-01-22 18:07:47,671 p=35978 u=zuul n=ansible | tempest : Set proper permission for tempest directory ------------------- 0.30s
2026-01-22 18:07:47,671 p=35978 u=zuul n=ansible | tempest : Change tempest directory permission back to original ---------- 0.28s
2026-01-22 18:07:47,671 p=35978 u=zuul n=ansible | tempest : Create tempest directories ------------------------------------ 0.27s
2026-01-22 18:07:47,671 p=35978 u=zuul n=ansible | tempest : Copy CA bundle to cifmw_tempest_artifacts_basedir ------------- 0.23s
2026-01-22 18:07:47,671 p=35978 u=zuul n=ansible | Thursday 22 January 2026 18:07:47 +0000 (0:00:00.027) 1:16:01.166 ******
2026-01-22 18:07:47,671 p=35978 u=zuul n=ansible | ===============================================================================
2026-01-22 18:07:47,671 p=35978 u=zuul n=ansible | tempest -------------------------------------------------------------- 4525.12s
2026-01-22 18:07:47,671 p=35978 u=zuul n=ansible | os_net_setup ----------------------------------------------------------- 34.69s
2026-01-22 18:07:47,671 p=35978 u=zuul n=ansible | run_hook ---------------------------------------------------------------- 0.80s
2026-01-22 18:07:47,671 p=35978 u=zuul n=ansible | fdp_update_container_images --------------------------------------------- 0.24s
2026-01-22 18:07:47,671 p=35978 u=zuul n=ansible | fdp_update_edpm --------------------------------------------------------- 0.20s
2026-01-22 18:07:47,672 p=35978 u=zuul n=ansible | cifmw_setup ------------------------------------------------------------- 0.05s
2026-01-22 18:07:47,672 p=35978 u=zuul n=ansible | ansible.builtin.assert -------------------------------------------------- 0.03s
2026-01-22 18:07:47,672 p=35978 u=zuul n=ansible | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
2026-01-22 18:07:47,672 p=35978 u=zuul n=ansible | total ---------------------------------------------------------------- 4561.12s