Trigger GMR for Service manila
Trigger GMR for Service cinder
Trigger GMR for Nova services
Will retrieve SOS reports from nodes crc
Generating SOS Report for crc
Journal size limit not set or invalid: ignoring
Starting pod/crc-debug-289bp ...
To use host binaries, run `chroot /host`
Generating SOS Report for EDPM compute-0
Generating SOS Report for EDPM compute-1
Generating SOS Report for EDPM compute-2
Warning: Permanently added '192.168.122.100' (ED25519) to the list of known hosts.
Warning: Permanently added '192.168.122.101' (ED25519) to the list of known hosts.
Warning: Permanently added '192.168.122.102' (ED25519) to the list of known hosts.
OMC mode: Collecting OLM resources (subscriptions, CSVs, etc.) in OMC format
sos report (version 4.10.1)
sos report (version 4.10.1)
+--------------------------------------+----------------+--------------------------------+----------+---------+-------+----------------------------+
| ID                                   | Binary         | Host                           | Zone     | Status  | State | Updated At                 |
+--------------------------------------+----------------+--------------------------------+----------+---------+-------+----------------------------+
| 60b87f02-ee1f-4548-9643-f3ced1d04e14 | nova-conductor | nova-cell0-conductor-0         | internal | enabled | up    | 2026-01-21T16:45:17.000000 |
| 43db67db-2af8-482a-959c-92607136721d | nova-scheduler | nova-scheduler-0               | internal | enabled | up    | 2026-01-21T16:45:19.000000 |
| 47eb95c8-2531-42ef-b577-07476201f8ed | nova-conductor | nova-cell1-conductor-0         | internal | enabled | up    | 2026-01-21T16:45:19.000000 |
| b1508e53-ebd1-45ba-93cd-2acdacf074f0 | nova-compute   | compute-2.ctlplane.example.com | nova     | enabled | up    | 2026-01-21T16:45:19.000000 |
| f960326d-4f19-4816-8acf-919b8bf95647 | nova-compute   | compute-1.ctlplane.example.com | nova     | enabled | up    | 2026-01-21T16:45:20.000000 |
| 33d87a18-fb9b-42f3-b6b5-9ebee0fa1aec | nova-compute   | compute-0.ctlplane.example.com | nova     | enabled | up    | 2026-01-21T16:45:19.000000 |
+--------------------------------------+----------------+--------------------------------+----------+---------+-------+----------------------------+
sos report (version 4.10.1)

This command will collect diagnostic and configuration information from this CentOS Linux system and installed applications.

An archive containing the collected information will be generated in /var/tmp/sos-osp/sos.m3ia63pu and may be provided to a CentOS support representative.

Any information provided to CentOS will be treated in accordance with the published support policies at:

        Community Website : https://www.centos.org/

The generated archive may contain data considered sensitive and its content should be reviewed by the originating organization before being passed to any third party.

No changes will be made to system configuration.

Setting up archive ...
Setting up plugins ...

This command will collect diagnostic and configuration information from this CentOS Linux system and installed applications.

An archive containing the collected information will be generated in /var/tmp/sos-osp/sos._84exgs1 and may be provided to a CentOS support representative.

Any information provided to CentOS will be treated in accordance with the published support policies at:

        Community Website : https://www.centos.org/

The generated archive may contain data considered sensitive and its content should be reviewed by the originating organization before being passed to any third party.

No changes will be made to system configuration.

Setting up archive ...
Setting up plugins ...
This command will collect diagnostic and configuration information from this CentOS Linux system and installed applications.

An archive containing the collected information will be generated in /var/tmp/sos-osp/sos.s48igoce and may be provided to a CentOS support representative.

Any information provided to CentOS will be treated in accordance with the published support policies at:

        Community Website : https://www.centos.org/

The generated archive may contain data considered sensitive and its content should be reviewed by the originating organization before being passed to any third party.

No changes will be made to system configuration.

Setting up archive ...
Setting up plugins ...
Gathering data for ns/openstack...
Wrote inspect data to /must-gather.
[plugin:networking] skipped command 'ip -s macsec show': required kmods missing: macsec. Use '--allow-system-changes' to enable collection.
[plugin:networking] skipped command 'ss -peaonmi': required kmods missing: xsk_diag. Use '--allow-system-changes' to enable collection.
[plugin:openstack_neutron] Could not open conf file /etc/neutron/plugins/ml2/ml2_conf.ini: [Errno 2] No such file or directory: '/etc/neutron/plugins/ml2/ml2_conf.ini'
Not all environment variables set. Source the environment file for the user intended to connect to the OpenStack environment.
[plugin:systemd] skipped command 'systemd-resolve --status': required services missing: systemd-resolved.
[plugin:systemd] skipped command 'systemd-resolve --statistics': required services missing: systemd-resolved.
Running plugins. Please wait ...
Starting 3/82 ata [Running: ata]
Starting 1/82 anaconda [Running: anaconda]
Starting 2/82 anacron [Running: anaconda anacron]
Starting 4/82 auditd [Running: anaconda anacron auditd]
Starting 5/82 block [Running: anaconda anacron auditd block]
Starting 6/82 boot [Running: anaconda auditd block boot]
Starting 7/82 buildah [Running: auditd block boot buildah]
Starting 8/82 ceph_common [Running: block boot buildah ceph_common]
Starting 9/82 ceph_mds [Running: block boot buildah ceph_mds]
[plugin:networking] skipped command 'ip -s macsec show': required kmods missing: macsec. Use '--allow-system-changes' to enable collection.
[plugin:networking] skipped command 'ss -peaonmi': required kmods missing: xsk_diag. Use '--allow-system-changes' to enable collection.
[plugin:openstack_neutron] Could not open conf file /etc/neutron/plugins/ml2/ml2_conf.ini: [Errno 2] No such file or directory: '/etc/neutron/plugins/ml2/ml2_conf.ini'
Not all environment variables set. Source the environment file for the user intended to connect to the OpenStack environment.
[plugin:networking] skipped command 'ip -s macsec show': required kmods missing: macsec. Use '--allow-system-changes' to enable collection.
[plugin:networking] skipped command 'ss -peaonmi': required kmods missing: xsk_diag. Use '--allow-system-changes' to enable collection.
Starting 10/82 ceph_mgr [Running: block boot ceph_mds ceph_mgr]
[plugin:openstack_neutron] Could not open conf file /etc/neutron/plugins/ml2/ml2_conf.ini: [Errno 2] No such file or directory: '/etc/neutron/plugins/ml2/ml2_conf.ini'
Not all environment variables set. Source the environment file for the user intended to connect to the OpenStack environment.
Starting 11/82 ceph_mon [Running: boot ceph_mds ceph_mgr ceph_mon]
[plugin:systemd] skipped command 'systemd-resolve --status': required services missing: systemd-resolved.
[plugin:systemd] skipped command 'systemd-resolve --statistics': required services missing: systemd-resolved.
[plugin:systemd] skipped command 'systemd-resolve --status': required services missing: systemd-resolved.
[plugin:systemd] skipped command 'systemd-resolve --statistics': required services missing: systemd-resolved.
Running plugins. Please wait ...
Starting 4/82 auditd [Running: auditd]
Starting 3/82 ata [Running: auditd ata]
Starting 2/82 anacron [Running: auditd anacron]
Starting 5/82 block [Running: auditd anacron block]
Starting 1/82 anaconda [Running: auditd anacron block anaconda]
Starting 6/82 boot [Running: auditd block anaconda boot]
Starting 7/82 buildah [Running: auditd block boot buildah]
Starting 8/82 ceph_common [Running: block boot buildah ceph_common]
Running plugins. Please wait ...
Starting 3/82 ata [Running: ata]
Starting 1/82 anaconda [Running: anaconda]
Starting 2/82 anacron [Running: anaconda anacron]
Starting 5/82 block [Running: anaconda anacron block]
Starting 4/82 auditd [Running: anaconda anacron block auditd]
Starting 6/82 boot [Running: anaconda block auditd boot]
Starting 9/82 ceph_mds [Running: block boot buildah ceph_mds]
Starting 7/82 buildah [Running: block auditd boot buildah]
Starting 8/82 ceph_common [Running: block boot buildah ceph_common]
Starting 9/82 ceph_mds [Running: block boot buildah ceph_mds]
Starting 10/82 ceph_mgr [Running: block boot ceph_mds ceph_mgr]
Starting 11/82 ceph_mon [Running: boot ceph_mds ceph_mgr ceph_mon]
Starting 10/82 ceph_mgr [Running: block boot ceph_mds ceph_mgr]
Starting 11/82 ceph_mon [Running: boot ceph_mds ceph_mgr ceph_mon]
Starting 12/82 ceph_osd [Running: boot ceph_mgr ceph_mon ceph_osd]
Starting 12/82 ceph_osd [Running: boot ceph_mgr ceph_mon ceph_osd]
Starting 12/82 ceph_osd [Running: boot ceph_mgr ceph_mon ceph_osd]
Starting 13/82 ceph_rgw [Running: ceph_mgr ceph_mon ceph_osd ceph_rgw]
Starting 14/82 cgroups [Running: ceph_mgr ceph_mon ceph_osd cgroups]
Starting 13/82 ceph_rgw [Running: ceph_mgr ceph_mon ceph_osd ceph_rgw]
Trying to pull registry.redhat.io/rhel9/support-tools:latest...
Starting 13/82 ceph_rgw [Running: ceph_mgr ceph_mon ceph_osd ceph_rgw]
Starting 15/82 chrony [Running: ceph_mgr ceph_mon cgroups chrony]
Getting image source signatures
Starting 16/82 console [Running: ceph_mgr ceph_mon cgroups console]
Checking if image destination supports signatures
Copying blob sha256:b98104ab0e1239a911fc1ca3c8589101c7fa3eb521b2c4b1fb1120038f55fbe9
Copying blob sha256:34b5c851d9cf523f162ceb72c260f1c6d1e556f8f4422e15258572766f2afc28
Starting 17/82 containers_common [Running: ceph_mgr ceph_mon cgroups containers_common]
Starting 14/82 cgroups [Running: ceph_mgr ceph_mon ceph_osd cgroups]
Starting 18/82 coredump [Running: ceph_mgr ceph_mon cgroups coredump]
Starting 19/82 cron [Running: ceph_mgr ceph_mon cgroups cron]
Starting 20/82 crypto [Running: ceph_mgr ceph_mon cgroups crypto]
Starting 21/82 dbus [Running: ceph_mgr ceph_mon cgroups dbus]
Starting 22/82 devicemapper [Running: ceph_mgr ceph_mon cgroups devicemapper]
Starting 23/82 devices [Running: ceph_mgr ceph_mon cgroups devices]
Starting 14/82 cgroups [Running: ceph_mgr ceph_mon ceph_osd cgroups]
Starting 24/82 dnf [Running: ceph_mgr ceph_mon cgroups dnf]
Starting 15/82 chrony [Running: ceph_mgr ceph_mon cgroups chrony]
Starting 16/82 console [Running: ceph_mgr ceph_mon cgroups console]
Starting 17/82 containers_common [Running: ceph_mgr ceph_mon cgroups containers_common]
Starting 18/82 coredump [Running: ceph_mgr ceph_mon cgroups coredump]
Starting 19/82 cron [Running: ceph_mgr ceph_mon cgroups cron]
Starting 20/82 crypto [Running: ceph_mgr ceph_mon cgroups crypto]
Starting 21/82 dbus [Running: ceph_mgr ceph_mon cgroups dbus]
Starting 22/82 devicemapper [Running: ceph_mgr ceph_mon cgroups devicemapper]
Starting 23/82 devices [Running: ceph_mgr ceph_mon cgroups devices]
Starting 25/82 filesys [Running: ceph_mgr ceph_mon dnf filesys]
Starting 15/82 chrony [Running: ceph_mgr ceph_mon cgroups chrony]
Starting 24/82 dnf [Running: ceph_mgr ceph_mon cgroups dnf]
Starting 26/82 firewall_tables [Running: ceph_mgr ceph_mon dnf firewall_tables]
Starting 16/82 console [Running: ceph_mgr ceph_mon cgroups console]
Starting 27/82 hardware [Running: ceph_mgr ceph_mon dnf hardware]
Starting 17/82 containers_common [Running: ceph_mgr ceph_mon cgroups containers_common]
Starting 18/82 coredump [Running: ceph_mgr ceph_mon cgroups coredump]
Starting 19/82 cron [Running: ceph_mgr ceph_mon cgroups cron]
Starting 20/82 crypto [Running: ceph_mgr ceph_mon cgroups crypto]
Starting 21/82 dbus [Running: ceph_mgr ceph_mon cgroups dbus]
Starting 22/82 devicemapper [Running: ceph_mgr ceph_mon cgroups devicemapper]
Starting 23/82 devices [Running: ceph_mgr ceph_mon cgroups devices]
Starting 24/82 dnf [Running: ceph_mgr ceph_mon cgroups dnf]
Starting 28/82 host [Running: ceph_mgr ceph_mon dnf host]
Starting 25/82 filesys [Running: ceph_mgr ceph_mon dnf filesys]
Starting 29/82 i18n [Running: ceph_mgr ceph_mon dnf i18n]
Starting 30/82 iscsi [Running: ceph_mgr ceph_mon dnf iscsi]
Starting 26/82 firewall_tables [Running: ceph_mgr ceph_mon dnf firewall_tables]
Starting 31/82 kdump [Running: ceph_mgr ceph_mon dnf kdump]
Copying config sha256:907c6f8a1bbc29560332663e8e9c85244e317088310a19891a847689ebec5226
Writing manifest to image destination
Storing signatures
Starting 27/82 hardware [Running: ceph_mgr ceph_mon dnf hardware]
907c6f8a1bbc29560332663e8e9c85244e317088310a19891a847689ebec5226
.toolboxrc file detected, overriding defaults...
Checking if there is a newer version of registry.redhat.io/rhel9/support-tools available...
Starting 25/82 filesys [Running: ceph_mgr ceph_mon dnf filesys]
Starting 26/82 firewall_tables [Running: ceph_mgr ceph_mon dnf firewall_tables]
Starting 28/82 host [Running: ceph_mgr ceph_mon dnf host]
Starting 27/82 hardware [Running: ceph_mgr ceph_mon dnf hardware]
Starting 29/82 i18n [Running: ceph_mgr ceph_mon dnf i18n]
Starting 30/82 iscsi [Running: ceph_mgr ceph_mon dnf iscsi]
Starting 31/82 kdump [Running: ceph_mgr ceph_mon dnf kdump]
Starting 32/82 kernel [Running: ceph_mgr ceph_mon dnf kernel]
Starting 28/82 host [Running: ceph_mgr ceph_mon dnf host]
Starting 29/82 i18n [Running: ceph_mgr ceph_mon dnf i18n]
Starting 30/82 iscsi [Running: ceph_mgr ceph_mon dnf iscsi]
Spawning a container 'toolbox-osp' with image 'registry.redhat.io/rhel9/support-tools'
Detected RUN label in the container image. Using that as the default...
Starting 31/82 kdump [Running: ceph_mgr ceph_mon dnf kdump]
9387d4441da99ee7c7cf906533513b3e1ba993fba418e7c508b666d73f1987c5
Gathering data for ns/metallb-system...
toolbox-osp sos report --batch --all-logs --only-plugins block,cifs,crio,devicemapper,devices,firewall_tables,firewalld,iscsi,lvm2,memory,multipath,nfs,nis,nvme,podman,process,processor,selinux,scsi,udev,logs,crypto --tmp-dir=/var/tmp/sos-osp; exit
Starting 32/82 kernel [Running: ceph_mgr ceph_mon dnf kernel]
Starting 33/82 keyutils [Running: ceph_mgr ceph_mon dnf keyutils]
Starting 34/82 krb5 [Running: ceph_mgr ceph_mon dnf krb5]
Starting 35/82 kvm [Running: ceph_mgr ceph_mon dnf kvm]
Starting 32/82 kernel [Running: ceph_mgr ceph_mon dnf kernel]
Starting 33/82 keyutils [Running: ceph_mgr ceph_mon dnf keyutils]
Starting 34/82 krb5 [Running: ceph_mgr ceph_mon dnf krb5]
Starting 35/82 kvm [Running: ceph_mgr ceph_mon dnf kvm]
Starting 36/82 ldap [Running: ceph_mon dnf kvm ldap]
Starting 37/82 libraries [Running: ceph_mon dnf kvm libraries]
Starting 38/82 libvirt [Running: ceph_mon dnf kvm libvirt]
Starting 39/82 login [Running: ceph_mon dnf kvm login]
Starting 40/82 logrotate [Running: ceph_mon dnf kvm logrotate]
Starting 41/82 logs [Running: ceph_mon dnf kvm logs]
Starting 33/82 keyutils [Running: ceph_mgr ceph_mon dnf keyutils]
Starting 34/82 krb5 [Running: ceph_mgr ceph_mon dnf krb5]
Starting 35/82 kvm [Running: ceph_mgr ceph_mon dnf kvm]
Starting 42/82 lvm2 [Running: ceph_mon dnf logs lvm2]
Starting 36/82 ldap [Running: ceph_mon dnf kvm ldap]
Starting 37/82 libraries [Running: ceph_mon dnf kvm libraries]
Starting 38/82 libvirt [Running: ceph_mon dnf kvm libvirt]
Starting 43/82 md [Running: ceph_mon dnf logs md]
Starting 39/82 login [Running: ceph_mon dnf kvm login]
Starting 44/82 memory [Running: ceph_mon dnf logs memory]
Warning: apps.openshift.io/v1 DeploymentConfig is deprecated in v4.14+, unavailable in v4.10000+
Starting 45/82 multipath [Running: ceph_mon dnf logs multipath]
Starting 40/82 logrotate [Running: ceph_mon dnf kvm logrotate]
Starting 46/82 networking [Running: ceph_mon dnf logs networking]
Starting 41/82 logs [Running: ceph_mon dnf kvm logs]
temporary directory /var/tmp/sos-osp does not exist or is not writable
exit
Starting 42/82 lvm2 [Running: ceph_mon dnf logs lvm2]
tar: Removing leading `/' from member names
tar: /var/log/pods/*/*.log.*: Warning: Cannot stat: No such file or directory
Starting 43/82 md [Running: ceph_mon dnf logs md]
Starting 44/82 memory [Running: ceph_mon dnf logs memory]
Starting 47/82 networkmanager [Running: ceph_mon logs networking networkmanager]
Starting 45/82 multipath [Running: ceph_mon dnf logs multipath]
Starting 46/82 networking [Running: ceph_mon dnf logs networking]
Starting 36/82 ldap [Running: ceph_mon dnf kvm ldap]
Starting 37/82 libraries [Running: ceph_mon dnf kvm libraries]
Starting 38/82 libvirt [Running: ceph_mon dnf kvm libvirt]
Starting 39/82 login [Running: ceph_mon dnf kvm login]
Starting 48/82 nfs [Running: ceph_mon networking networkmanager nfs]
Wrote inspect data to /must-gather.
Starting 40/82 logrotate [Running: ceph_mon dnf kvm logrotate]
Starting 41/82 logs [Running: ceph_mon dnf kvm logs]
Starting 49/82 numa [Running: ceph_mon networking networkmanager numa]
Starting 42/82 lvm2 [Running: ceph_mon dnf logs lvm2]
Starting 50/82 nvme [Running: ceph_mon networking networkmanager nvme]
Starting 51/82 openhpi [Running: ceph_mon networking networkmanager openhpi]
Starting 52/82 openstack_edpm [Running: ceph_mon networking networkmanager openstack_edpm]
Starting 53/82 openstack_neutron [Running: ceph_mon networking networkmanager openstack_neutron]
Starting 54/82 openstack_nova [Running: ceph_mon networking networkmanager openstack_nova]
Wrote inspect data to /must-gather.
Starting 55/82 openvswitch [Running: ceph_mon networking networkmanager openvswitch]
Starting 43/82 md [Running: ceph_mon dnf logs md]
Starting 44/82 memory [Running: ceph_mon dnf logs memory]
Starting 45/82 multipath [Running: ceph_mon dnf logs multipath]
Wrote inspect data to /must-gather.
Gathering secrets in namespace openstack
Starting 46/82 networking [Running: ceph_mon dnf logs networking]
Starting 47/82 networkmanager [Running: ceph_mon logs networking networkmanager]
Wrote inspect data to /must-gather.
Starting 56/82 ovn_host [Running: ceph_mon networking openvswitch ovn_host]
Starting 57/82 pam [Running: ceph_mon networking openvswitch pam]
Starting 48/82 nfs [Running: ceph_mon networking networkmanager nfs]
Starting 49/82 numa [Running: ceph_mon networking networkmanager numa]
Starting 58/82 pci [Running: ceph_mon networking openvswitch pci]
Starting 50/82 nvme [Running: ceph_mon networking networkmanager nvme]
Starting 59/82 podman [Running: ceph_mon networking openvswitch podman]
Starting 51/82 openhpi [Running: ceph_mon networking networkmanager openhpi]
Starting 52/82 openstack_edpm [Running: ceph_mon networking networkmanager openstack_edpm]
Starting 53/82 openstack_neutron [Running: ceph_mon networking networkmanager openstack_neutron]
Starting 54/82 openstack_nova [Running: ceph_mon networking networkmanager openstack_nova]
Starting 55/82 openvswitch [Running: ceph_mon networking networkmanager openvswitch]
Starting 56/82 ovn_host [Running: ceph_mon networking openvswitch ovn_host]
Starting 57/82 pam [Running: ceph_mon networking openvswitch pam]
Starting 58/82 pci [Running: ceph_mon networking openvswitch pci]
Starting 47/82 networkmanager [Running: ceph_mon logs networking networkmanager]
Starting 59/82 podman [Running: ceph_mon networking openvswitch podman]
Starting 60/82 process [Running: ceph_mon networking podman process]
Starting 48/82 nfs [Running: ceph_mon networking networkmanager nfs]
Starting 49/82 numa [Running: ceph_mon networking networkmanager numa]
Starting 50/82 nvme [Running: ceph_mon networking networkmanager nvme]
Starting 51/82 openhpi [Running: ceph_mon networking networkmanager openhpi]
Starting 52/82 openstack_edpm [Running: ceph_mon networking networkmanager openstack_edpm]
Starting 53/82 openstack_neutron [Running: ceph_mon networking networkmanager openstack_neutron]
Starting 54/82 openstack_nova [Running: ceph_mon networking networkmanager openstack_nova]
Starting 55/82 openvswitch [Running: ceph_mon networking networkmanager openvswitch]
Starting 60/82 process [Running: ceph_mon networking podman process]
Starting 61/82 processor [Running: ceph_mon networking process processor]
Starting 56/82 ovn_host [Running: ceph_mon networking openvswitch ovn_host]
Starting 57/82 pam [Running: ceph_mon networking openvswitch pam]
Starting 58/82 pci [Running: ceph_mon networking openvswitch pci]
Starting 59/82 podman [Running: ceph_mon networking openvswitch podman]
Starting 62/82 python [Running: ceph_mon networking processor python]
Starting 61/82 processor [Running: ceph_mon networking process processor]
Starting 63/82 release [Running: ceph_mon processor python release]
Starting 64/82 rpm [Running: ceph_mon processor python rpm]
Starting 62/82 python [Running: ceph_mon process processor python]
Starting 65/82 sar [Running: ceph_mon processor python sar]
Starting 66/82 scsi [Running: ceph_mon processor python scsi]
Starting 67/82 selinux [Running: ceph_mon processor scsi selinux]
Starting 68/82 services [Running: ceph_mon processor selinux services]
Starting 69/82 ssh [Running: ceph_mon processor selinux ssh]
Starting 70/82 sudo [Running: ceph_mon processor selinux sudo]
Starting 71/82 sunrpc [Running: ceph_mon processor selinux sunrpc]
Starting 72/82 system [Running: ceph_mon processor selinux system]
Starting 63/82 release [Running: ceph_mon processor python release]
Starting 64/82 rpm [Running: ceph_mon processor python rpm]
Starting 65/82 sar [Running: ceph_mon processor rpm sar]
Starting 66/82 scsi [Running: ceph_mon processor rpm scsi]
Starting 67/82 selinux [Running: ceph_mon processor rpm selinux]
Starting 68/82 services [Running: ceph_mon processor selinux services]
Starting 60/82 process [Running: ceph_mon networking podman process]
Starting 69/82 ssh [Running: ceph_mon processor selinux ssh]
Starting 70/82 sudo [Running: ceph_mon processor selinux sudo]
Starting 71/82 sunrpc [Running: ceph_mon processor selinux sunrpc]
Starting 72/82 system [Running: ceph_mon processor selinux system]
Starting 73/82 systemd [Running: ceph_mon processor selinux systemd]
Starting 61/82 processor [Running: ceph_mon podman process processor]
Starting 73/82 systemd [Running: ceph_mon processor selinux systemd]
Starting 62/82 python [Running: ceph_mon podman processor python]
Warning: apps.openshift.io/v1 DeploymentConfig is deprecated in v4.14+, unavailable in v4.10000+
Starting 63/82 release [Running: ceph_mon processor python release]
Starting 64/82 rpm [Running: ceph_mon processor python rpm]
Starting 65/82 sar [Running: ceph_mon processor rpm sar]
Starting 66/82 scsi [Running: ceph_mon processor rpm scsi]
Starting 74/82 sysvipc [Running: ceph_mon processor systemd sysvipc]
Starting 75/82 tpm2 [Running: ceph_mon processor systemd tpm2]
Starting 76/82 tuned [Running: ceph_mon processor systemd tuned]
Starting 67/82 selinux [Running: ceph_mon processor rpm selinux]
Starting 68/82 services [Running: ceph_mon processor selinux services]
Starting 69/82 ssh [Running: ceph_mon processor selinux ssh]
Starting 70/82 sudo [Running: ceph_mon processor selinux sudo]
Starting 71/82 sunrpc [Running: ceph_mon processor selinux sunrpc]
Starting 72/82 system [Running: ceph_mon processor selinux system]
Starting 77/82 udev [Running: ceph_mon processor systemd udev]
Starting 78/82 unbound [Running: ceph_mon processor systemd unbound]
Starting 79/82 vhostmd [Running: ceph_mon processor systemd vhostmd]
Starting 80/82 virsh [Running: ceph_mon processor systemd virsh]
Starting 74/82 sysvipc [Running: ceph_mon processor systemd sysvipc]
Starting 75/82 tpm2 [Running: ceph_mon processor systemd tpm2]
Starting 76/82 tuned [Running: ceph_mon processor systemd tuned]
Wrote inspect data to /must-gather.
Starting 77/82 udev [Running: ceph_mon processor systemd udev]
Starting 78/82 unbound [Running: ceph_mon processor systemd unbound]
Starting 79/82 vhostmd [Running: ceph_mon processor systemd vhostmd]
Starting 80/82 virsh [Running: ceph_mon processor systemd virsh]
Starting 81/82 xen [Running: ceph_mon processor virsh xen]
Finishing plugins [Running: ceph_mon processor virsh]
Starting 82/82 xfs [Running: ceph_mon processor virsh xfs]
Starting 73/82 systemd [Running: ceph_mon processor selinux systemd]
Finishing plugins [Running: ceph_mon processor virsh]
Finishing plugins [Running: ceph_mon processor]
Starting 81/82 xen [Running: ceph_mon processor virsh xen]
Finishing plugins [Running: ceph_mon processor virsh]
Starting 82/82 xfs [Running: ceph_mon processor virsh xfs]
Finishing plugins [Running: ceph_mon]
Finishing plugins [Running: ceph_mon processor xfs]
Finishing plugins [Running: ceph_mon processor]
Finished running plugins
Finishing plugins [Running: ceph_mon]
Finished running plugins
Starting 74/82 sysvipc [Running: ceph_mon processor systemd sysvipc]
Starting 75/82 tpm2 [Running: ceph_mon processor systemd tpm2]
Starting 76/82 tuned [Running: ceph_mon processor systemd tuned]
Starting 77/82 udev [Running: ceph_mon processor systemd udev]
Starting 78/82 unbound [Running: ceph_mon processor systemd unbound]
Starting 79/82 vhostmd [Running: ceph_mon processor systemd vhostmd]
Starting 80/82 virsh [Running: ceph_mon processor systemd virsh]
Starting 81/82 xen [Running: processor systemd virsh xen]
Finishing plugins [Running: processor systemd virsh]
Starting 82/82 xfs [Running: processor systemd virsh xfs]
Finishing plugins [Running: processor systemd virsh]
Finishing plugins [Running: processor virsh]
Finishing plugins [Running: virsh]
Finished running plugins
tar: Removing leading `/' from hard link targets
Removing debug pod ...
Retrieving SOS Report for crc
Starting pod/crc-debug-kh6tt ...
To use host binaries, run `chroot /host`
Removing debug pod ...
Gathering data for ns/openstack-operators...
Wrote inspect data to /must-gather.
Gathering secrets in namespace openstack-operators
Wrote inspect data to /must-gather.
Wrote inspect data to /must-gather.
Warning: apps.openshift.io/v1 DeploymentConfig is deprecated in v4.14+, unavailable in v4.10000+
Wrote inspect data to /must-gather.
Gathering data for ns/openshift-machine-api...
Warning: apps.openshift.io/v1 DeploymentConfig is deprecated in v4.14+, unavailable in v4.10000+
Wrote inspect data to /must-gather.
Wrote inspect data to /must-gather.
Gathering secrets in namespace openshift-machine-api
Wrote inspect data to /must-gather.
Wrote inspect data to /must-gather.
Creating compressed archive...

Your sos report has been generated and saved in:
	/var/tmp/sos-osp/sosreport-compute-2-2026-01-21-jftddnc.tar.xz

 Size	24.18MiB
 Owner	root
 sha256	9af691929860b9280d6663abe2e1e288e72f61ffd0e821cf9c670943a6a8cda2

Please send this file to your support representative.
Retrieving SOS Report for compute-2
Creating compressed archive...

Your sos report has been generated and saved in:
	/var/tmp/sos-osp/sosreport-compute-1-2026-01-21-tpgcxxk.tar.xz

 Size	22.89MiB
 Owner	root
 sha256	cec1797b987a1593f3449b4a9cd7288c287ed816bb866a06347d014d6391264a

Please send this file to your support representative.
Retrieving SOS Report for compute-1
Finished retrieving SOS Report for compute-2
Finished retrieving SOS Report for compute-1
Creating compressed archive...

Your sos report has been generated and saved in:
	/var/tmp/sos-osp/sosreport-compute-0-2026-01-21-tgsaraf.tar.xz

 Size	27.05MiB
 Owner	root
 sha256	8dc105031b3d68e3656fcf44dad14207040c48765d7a255177840de18cf007bc

Please send this file to your support representative.
Retrieving SOS Report for compute-0
Finished retrieving SOS Report for compute-0
Gathering data for ns/cert-manager...
Gathering secrets in namespace cert-manager
Warning: apps.openshift.io/v1 DeploymentConfig is deprecated in v4.14+, unavailable in v4.10000+
Wrote inspect data to /must-gather.
Wrote inspect data to /must-gather.
Wrote inspect data to /must-gather.
Wrote inspect data to /must-gather.
Gathering data for ns/openshift-nmstate...
Gathering secrets in namespace openshift-nmstate
Warning: apps.openshift.io/v1 DeploymentConfig is deprecated in v4.14+, unavailable in v4.10000+
W0121 16:47:24.327590 6323 util.go:195] skipping , failed to read event err: Object 'Kind' is missing in ''
Wrote inspect data to /must-gather.
Wrote inspect data to /must-gather.
Wrote inspect data to /must-gather.
Wrote inspect data to /must-gather.
Gathering data for ns/openshift-operators...
Warning: apps.openshift.io/v1 DeploymentConfig is deprecated in v4.14+, unavailable in v4.10000+
Gathering secrets in namespace openshift-operators
Wrote inspect data to /must-gather.
Wrote inspect data to /must-gather.
Wrote inspect data to /must-gather.
Wrote inspect data to /must-gather.
Gathering data for ns/metallb-system...
Gathering secrets in namespace metallb-system
Wrote inspect data to /must-gather.
Wrote inspect data to /must-gather.
Wrote inspect data to /must-gather.
Warning: apps.openshift.io/v1 DeploymentConfig is deprecated in v4.14+, unavailable in v4.10000+
Wrote inspect data to /must-gather.
Gathering data for ns/openshift-marketplace...
Gathering secrets in namespace openshift-marketplace
Wrote inspect data to /must-gather.
Wrote inspect data to /must-gather.
Wrote inspect data to /must-gather.
Warning: apps.openshift.io/v1 DeploymentConfig is deprecated in v4.14+, unavailable in v4.10000+
Wrote inspect data to /must-gather.
Gathering data for ns/openshift-operators...
Gathering secrets in namespace openshift-operators
Warning: apps.openshift.io/v1 DeploymentConfig is deprecated in v4.14+, unavailable in v4.10000+
Wrote inspect data to /must-gather.
Wrote inspect data to /must-gather.
Wrote inspect data to /must-gather.
Wrote inspect data to /must-gather.
Wrote inspect data to /must-gather.
Wrote inspect data to /must-gather.
Gathering data for ns/metallb-system...
Gathering data for ns/cert-manager...
Warning: apps.openshift.io/v1 DeploymentConfig is deprecated in v4.14+, unavailable in v4.10000+
Gathering data for ns/openstack-operators...
Wrote inspect data to /must-gather.
Warning: apps.openshift.io/v1 DeploymentConfig is deprecated in v4.14+, unavailable in v4.10000+
Wrote inspect data to /must-gather.
Wrote inspect data to /must-gather.
Gathering data for ns/cert-manager...
Wrote inspect data to /must-gather.
Gathering data for ns/openshift-machine-api...
Wrote inspect data to /must-gather.
Gathering data for ns/openstack-operators...
Gathering data for ns/openshift-nmstate...
Wrote inspect data to /must-gather.
Gathering data for ns/openshift-monitoring...
Gathering data for ns/openshift-multus...
Wrote inspect data to /must-gather.
Gather ctlplane service info: openstack
Gather ctlplane service info: ovn
Gather ctlplane service info: rabbitmq
Copying OVN NB database from ovsdbserver-nb-0
Copying OVN SB database from ovsdbserver-sb-0
Defaulted container "rabbitmq" out of: rabbitmq, setup-container (init)
Defaulted container "rabbitmq" out of: rabbitmq, setup-container (init)
Defaulted container "rabbitmq" out of: rabbitmq, setup-container (init)
Defaulted container "rabbitmq" out of: rabbitmq, setup-container (init)
Defaulted container "rabbitmq" out of: rabbitmq, setup-container (init)
Defaulted container "ovsdbserver-nb" out of: ovsdbserver-nb, openstack-network-exporter
Defaulted container "ovsdbserver-sb" out of: ovsdbserver-sb, openstack-network-exporter
tar: Removing leading `/' from member names
tar: Removing leading `/' from member names
Gather ctlplane service info: keystone
Gather ctlplane service info: glance
Gather ctlplane service info: manila
Gather ctlplane service info: cinder
Defaulted container "rabbitmq" out of: rabbitmq, setup-container (init)
Defaulted container "rabbitmq" out of: rabbitmq, setup-container (init)
Defaulted container "rabbitmq" out of: rabbitmq, setup-container (init)
Defaulted container "rabbitmq" out of: rabbitmq, setup-container (init)
Defaulted container "rabbitmq" out of: rabbitmq, setup-container (init)
Defaulted container "rabbitmq" out of: rabbitmq, setup-container (init)
Gather ctlplane service info: nova
Gather ctlplane service info: placement
Defaulted container "rabbitmq" out of: rabbitmq, setup-container (init)
Gather ctlplane service info: neutron
Gather ctlplane service info: swift
Gather ctlplane service info: barbican
Gather ctlplane service info: ceilometer
Extensions list not supported by Identity API
tar: Removing leading `/' from member names
The /must-gather/must-gather.tar.xz now can be attached to the support case.
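Before attaching the archive to a support case, it can help to confirm that it unpacks cleanly and that the per-node sos reports listed above made it inside. Below is a minimal sketch using standard tar/sha256sum/find commands; it assumes the run left /must-gather/must-gather.tar.xz on the local workstation, and the extraction directory (~/osp-must-gather) is only an illustrative choice. Internal directory layout and sosreport file names vary between runs.

# Record the checksum of the archive you are about to upload, then list its top-level contents.
sha256sum /must-gather/must-gather.tar.xz
tar -tJf /must-gather/must-gather.tar.xz | head -20

# Extract locally and check that one sos report per node (crc, compute-0/1/2) was captured.
mkdir -p ~/osp-must-gather
tar -xJf /must-gather/must-gather.tar.xz -C ~/osp-must-gather
find ~/osp-must-gather -name 'sosreport-*.tar.xz'

# An individual sos report can be compared against the sha256 printed in the log above,
# e.g. 9af69192... for compute-2.
find ~/osp-must-gather -name 'sosreport-compute-2-*.tar.xz' -exec sha256sum {} +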